The Camera device type is implemented using two traits: PushAvStreamTransport, which handles audio and video stream transport using push-based protocols, and WebRtcLiveView, which provides the ability to control livestreams and talkback. The Doorbell device type, for those implementations that have camera capabilities, also uses these traits.
| Home APIs Device Type | Traits | Kotlin Sample App | Use Case |
|---|---|---|---|
| **Camera** A device that captures still images or video. Cameras may feature accessible livestreams, two-way talkback, or detection events. | Required Traits: google PushAvStreamTransport, google WebRtcLiveView | Camera | |
| **Doorbell** A device actuated by a button outside a door that makes an audible and/or visual signal, used to request the attention of a person who is somewhere on the other side of the door. Doorbells may feature accessible livestreams, two-way talkback, or detection events. | Required Traits: google PushAvStreamTransport, google WebRtcLiveView | Doorbell | |
Start a livestream
To start a livestream, send the Session Description Protocol (SDP) string to the WebRtcLiveView trait's startLiveView() method, which returns a WebRtcLiveViewTrait.StartLiveViewCommand.Response containing three values:
- The SDP for the session.
- The session duration in seconds.
- The session ID, which may be used to extend or terminate the session.
```kotlin
suspend fun getWebRtcLiveViewTrait(cameraDevice, cameraDeviceType) {
  return cameraDevice.type(cameraDeviceType).trait(WebRtcLiveView).first {
    it?.metadata?.sourceConnectivity?.connectivityState == ConnectivityState.ONLINE
  }
}

// Start the live view
suspend fun startCameraStream(trait: WebRtcLiveView, offerSdp: String) {
  val response = trait.startLiveView(offerSdp)
  // Response contains three fields (see below)
  return response
}

...

// This is used to manage the WebRTC connection
val peerConnection: RTCPeerConnection = ...

...

// offerSdp is the local SDP offer created for peerConnection (elided above)
val webRtcLiveViewTrait = getWebRtcLiveViewTrait(cameraDevice, cameraDeviceType)
val startResponse = startCameraStream(webRtcLiveViewTrait, offerSdp)
val answerSdp = startResponse?.answerSdp
val sessionDuration = startResponse?.liveSessionDurationSeconds
val mediaSessionId = startResponse?.mediaSessionId

peerConnection.setRemoteDescription(SessionDescription.Type.ANSWER, answerSdp)
```
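The media session ID from the response is what identifies the stream when you extend or terminate it later. As a minimal sketch, assuming the WebRtcLiveView trait also exposes a stopLiveView() command that takes the media session ID (verify the command name in the generated trait API before relying on it), tearing the view down might look like this:

```kotlin
// Sketch only: end the live view and release local WebRTC resources.
// Assumes WebRtcLiveView exposes stopLiveView(mediaSessionId); verify before use.
suspend fun stopCameraStream(trait: WebRtcLiveView, mediaSessionId: String) {
  trait.stopLiveView(mediaSessionId)
  // Also close the peer connection that was managing the WebRTC session:
  // peerConnection.close()
}
```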
Extend a livestream
Livestreams have a preset duration after which they expire. To lengthen the duration of an active stream, issue an extension request using the WebRtcLiveView.extendLiveView() method:
```kotlin
// Assuming camera stream has just been started
suspend fun scheduleExtension(
  trait: WebRtcLiveView,
  mediaSessionId: String,
  liveSessionDurationSeconds: UShort
): UShort {
  // Wait until just before the session expires, then extend it
  delay((liveSessionDurationSeconds.toLong() - BUFFER_SECONDS) * 1000)
  val response = trait.extendLiveView(mediaSessionId)
  // returns how long the session will be live for
  return response.liveSessionDurationSeconds
}
```
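scheduleExtension() extends the session once. As a sketch (an assumption about the desired behavior, not part of the sample app), it can be run in a loop so the stream stays alive until the enclosing coroutine is cancelled, feeding each returned duration into the next wait:

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.isActive
import kotlinx.coroutines.launch

// Sketch: keep extending the live view until the coroutine is cancelled
// (for example, when the user closes the viewer screen).
fun keepLiveViewAlive(
  scope: CoroutineScope,
  trait: WebRtcLiveView,
  mediaSessionId: String,
  initialDurationSeconds: UShort
) = scope.launch {
  var duration = initialDurationSeconds
  while (isActive) {
    // scheduleExtension() waits out most of the current duration, then extends
    duration = scheduleExtension(trait, mediaSessionId, duration)
  }
}
```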
Enable and disable recording capability
To enable the camera's recording capability, pass TransportStatusEnum.Active to the PushAvStreamTransport trait's setTransportStatus() method. To disable the recording capability, pass it TransportStatusEnum.Inactive.
The following example wraps these calls in a single function that uses a Boolean to toggle the recording capability:
```kotlin
// Start or stop recording for all connections.
suspend fun setCameraRecording(isOn: Boolean) {
  val pushAvStreamTransport = getPushAvStreamTransport()
  if (isOn) {
    pushAvStreamTransport.setTransportStatus(TransportStatusEnum.Active)
  } else {
    pushAvStreamTransport.setTransportStatus(TransportStatusEnum.Inactive)
  }
}
```
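The recording examples in this section call a getPushAvStreamTransport() helper that isn't shown. A minimal sketch, assuming it resolves the trait the same way the earlier getWebRtcLiveViewTrait() helper does (same loose snippet style, same placeholder parameters):

```kotlin
// Sketch: resolve the PushAvStreamTransport trait for an online camera,
// mirroring the getWebRtcLiveViewTrait() helper shown earlier.
suspend fun getPushAvStreamTransport(cameraDevice, cameraDeviceType) {
  return cameraDevice.type(cameraDeviceType).trait(PushAvStreamTransport).first {
    it?.metadata?.sourceConnectivity?.connectivityState == ConnectivityState.ONLINE
  }
}
```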
Check to see if recording capability is enabled
To determine if a camera's recording capability is enabled, check to see if any connections are active. The following example defines two functions to do this:
```kotlin
// Get the on/off state
suspend fun onOffState(cameraDevice: HomeDevice, cameraDeviceType): Boolean {
  // Query the device for pushAvStreamTransport
  val pushAvTrait = getPushAvStreamTransport()
  return pushAvTrait.recordModeActive()
}

// Check to see if the camera's recording capability is enabled
fun PushAvStreamTransport.recordModeActive(): Boolean {
  return currentConnections?.any {
    it.transportStatus == TransportStatusEnum.Active
  } ?: false
}
```
Another way to check is to use the findTransport() function with a predicate:
```kotlin
// Fetch the current connections
suspend fun queryRecordModeState(cameraDevice: HomeDevice, cameraDeviceType): Boolean {
  val pushAvStreamTransport = getPushAvStreamTransport()
  return pushAvStreamTransport.findTransport().let { response ->
    response.transportConfigurations.any {
      it.transportStatus == TransportStatusEnum.Active
    }
  }
}
```
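As a usage sketch, either check can be paired with setCameraRecording() to flip recording to the opposite of its current state (the cameraDevice and cameraDeviceType placeholders are the same as above):

```kotlin
// Sketch: read the current recording state, then flip it.
suspend fun toggleCameraRecording(cameraDevice: HomeDevice, cameraDeviceType) {
  val isRecording = queryRecordModeState(cameraDevice, cameraDeviceType)
  setCameraRecording(isOn = !isRecording)
}
```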
Start and stop talkback
To start talkback, call the WebRtcLiveView trait's startTalkback() method. To stop, use stopTalkback().
```kotlin
// Make sure camera stream is on
suspend fun setTalkback(isOn: Boolean, trait: WebRtcLiveView, mediaSessionId: String) {
  if (isOn) {
    trait.startTalkback(mediaSessionId)
  } else {
    trait.stopTalkback(mediaSessionId)
  }
}
```
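As a usage sketch, with a live session already running and mediaSessionId taken from the startLiveView() response, setTalkback() can be wired to a push-to-talk control: enabled while the button is held, disabled on release. The handler names below are hypothetical.

```kotlin
// Sketch: wire setTalkback() to a push-to-talk button.
suspend fun onTalkButtonPressed(trait: WebRtcLiveView, mediaSessionId: String) =
  setTalkback(isOn = true, trait = trait, mediaSessionId = mediaSessionId)

suspend fun onTalkButtonReleased(trait: WebRtcLiveView, mediaSessionId: String) =
  setTalkback(isOn = false, trait = trait, mediaSessionId = mediaSessionId)
```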