1. Before you begin
The CameraStream trait belongs to devices with the capability to stream video feeds to smart displays, Chromecast devices, and smartphones. The WebRTC protocol is now supported within the CameraStream trait, which means that you can greatly reduce startup and streaming latency from a camera device to a Google Nest display device.
Prerequisites
- Review Cloud-to-cloud primer.
What you'll learn
- How to deploy a smart home cloud service.
- How to connect your service to Google Assistant.
- How to stream to a Google Nest display device with the WebRTC protocol.
What you'll need
- A web browser, such as Google Chrome.
- An iOS or Android device with the Google Home app.
- Node.js version 10.16 or higher.
- Blaze (pay-as-you-go) plan for Firebase.
- A built-in or external webcam device that can support full HD resolution.
- A Google Nest display device.
2. Get started
Install the Firebase CLI
The Firebase CLI lets you serve your web apps locally and deploy them to Firebase Hosting.
To install the Firebase CLI, follow these steps:
- In your terminal, download and install the Firebase CLI:
$ npm install -g firebase-tools
- Verify that the CLI installed correctly:
$ firebase --version
- Authorize the Firebase CLI with your Google Account:
$ firebase login
Create and configure an Actions project
- Go to the Actions console and then click New project.
- In the Project name text box, enter a name for the project and then click Create project.
- On the What kind of Action do you want to build? page, click Smart home > Start building. The project opens in the Actions console.
- Click Develop > Invocation.
- In the Display name text box, enter a name for the Action and then click Save. This name appears in the Google Home app later when there's a device to set up. For this codelab, we entered WebRTC Codelab as the display name, but you can use a different name.
- Click Actions.
- In the Fulfillment URL text box, enter a placeholder URL, such as https://example.com.
Run the CameraStream client app
The source code for this codelab includes a WebRTC client that establishes, negotiates, and manages the WebRTC session between the webcam and Google smart home display device.
To get the source code for the CameraStream WebRTC client app, clone this GitHub repository:
$ git clone https://github.com/google-home/smarthome-camerastream-webrtc.git
The code contains the following directories:
- The camerastream-start directory, which contains the starter code upon which you build.
- The camerastream-done directory, which contains the solution code for the finished codelab.
The camerastream-start directory contains the following subdirectories:
- The public subdirectory, which contains a frontend UI to easily control and monitor the state of your camera device.
- The functions subdirectory, which contains a fully implemented cloud service that manages the camera with Cloud Functions for Firebase and Realtime Database.
The starter code contains TODO comments that indicate where you need to add or change code, such as the following example:
// TODO: Implement full SYNC response.
Connect to Firebase
- Navigate to the camerastream-start directory and then set up the Firebase CLI with your Actions project:
$ cd camerastream-start
$ firebase use PROJECT_ID
- In the camerastream-start directory, navigate to the functions folder and then install all the necessary dependencies:
$ cd functions
$ npm install
- If you see the following message, ignore it. This warning is due to older dependencies. For more information, see this GitHub issue.
found 5 high severity vulnerabilities
run `npm audit fix` to fix them, or `npm audit` for details
- Initialize a Firebase project:
$ firebase init
- Select Functions and Hosting. This initializes the necessary APIs and features for your project.
? Which Firebase CLI features do you want to set up for this folder? Press Space to select features, then Enter to confirm your choices.
❯◯ Realtime Database: Configure a security rules file for Realtime Database and (optionally) provision default instance
 ◯ Firestore: Deploy rules and create indexes for Firestore
 ◉ Functions: Configure a Cloud Functions directory and its files
 ◉ Hosting: Configure files for Firebase Hosting and (optionally) set up GitHub Action deploys
 ◯ Hosting: Set up GitHub Action deploys
 ◯ Storage: Configure a security rules file for Cloud Storage
 ◯ Extensions: Set up an empty Extensions manifest
- Configure Cloud Functions with the default files, and ensure that you don't overwrite the existing index.js and package.json files in the project sample:
? Would you like to initialize a new codebase, or overwrite an existing one? Overwrite
? What language would you like to use to write Cloud Functions? JavaScript
? File functions/package.json already exists. Overwrite? No
? File functions/index.js already exists. Overwrite? No
? Do you want to install dependencies with npm now? Yes
- Configure Hosting with the public directory in the project code and use the existing index.html file:
? What do you want to use as your public directory? public
? Configure as a single-page app (rewrite all urls to /index.html)? Yes
? Set up automatic builds and deploys with GitHub? No
? File public/index.html already exists. Overwrite? No
3. Exchange Session Description Protocol (SDP) messages
The exchange of SDP messages is an important step in the establishment of a WebRTC stream. SDP is a text-based protocol that describes the characteristics of a multimedia session. It's used in WebRTC to negotiate the parameters of a peer-to-peer connection, such as the codecs used, the IP addresses of the participants, and the ports used for media transport.
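To make the flow concrete, the following minimal sketch shows this negotiation from the browser side, assuming the Firebase app is already initialized as described in the steps below; the database paths and message shape are illustrative placeholders, not the exact scheme that webrtc_generator.js uses:
// Minimal sketch only: paths such as '/session/offer' are illustrative placeholders.
const pc = new RTCPeerConnection({iceServers: [{urls: 'stun:stun.l.google.com:19302'}]});

async function publishOffer() {
  // Create the offer SDP and store it where the other peer can read it.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  await firebase.database().ref('/session/offer').set({type: offer.type, sdp: offer.sdp});
}

// Apply the answer SDP as soon as the other peer writes it back.
firebase.database().ref('/session/answer').on('value', async (snapshot) => {
  const answer = snapshot.val();
  if (answer && !pc.currentRemoteDescription) {
    await pc.setRemoteDescription(new RTCSessionDescription(answer));
  }
});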
To use Realtime Database as a host to exchange SDP messages between your webcam and the smart home CameraStream client app, follow these steps:
- In the Firebase console, click Build > Realtime Database > Create database.
- In the Realtime Database location drop-down menu, select an appropriate location from which to host your database.
- Select Start in test mode and then click Enable. With Realtime Database enabled, you need to be able to reference it from the CameraStream client app.
- In the Firebase console, select Project settings > Add Firebase to your Web App to launch the setup workflow.
- If you already added an app to your Firebase project, click Add app to display the platform options.
- Enter a nickname for the app, such as My web app, and then click Register app.
- In the Add Firebase SDK section, select Use <script> tag.
- Copy the values from the firebaseConfig object and then paste them into the camerastream-start/public/webrtc_generator.js file:
const firebaseConfig = {
  apiKey: "XXXXX",
  authDomain: "XXXXX",
  projectId: "XXXXX",
  storageBucket: "XXXXX",
  messagingSenderId: "XXXXX",
  appId: "XXXXX",
  measurementId: "XXXXX"
};
- Click Continue to console to complete the process. You see the newly created web app on the Project settings page.
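As a rough sketch of how webrtc_generator.js can consume the pasted values, the app is initialized with that configuration and a Realtime Database handle is obtained; the databaseRef variable name below is only illustrative:
// Initialize the Firebase app with the configuration you pasted above.
firebase.initializeApp(firebaseConfig);
// Obtain a Realtime Database handle for the SDP exchange (variable name is illustrative).
const databaseRef = firebase.database();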
4. Create a WebRTC camera
Now that you configured your Action, your cloud service needs to handle the following intents:
- A SYNC intent that occurs when Assistant wants to know what devices the user has connected. This is sent to your service when the user links an account. You should respond with a JSON payload of the user's devices and their capabilities.
- An EXECUTE/QUERY intent that occurs when Assistant wants to control a device on a user's behalf. You should respond with a JSON payload with the execution status of each requested device.
In this section, you update the functions that you previously deployed to handle these intents.
Update the SYNC response
- Navigate to the functions/index.js file. It contains the code to respond to requests from Assistant.
- Edit the SYNC intent to return the device's metadata and capabilities:
index.js
app.onSync((body) => {
  return {
    requestId: body.requestId,
    payload: {
      agentUserId: USER_ID,
      devices: [{
        id: 'camera',
        type: 'action.devices.types.CAMERA',
        traits: [
          'action.devices.traits.OnOff',
          'action.devices.traits.CameraStream',
        ],
        name: {
          defaultNames: ['My WebRTC Camera'],
          name: 'Camera',
          nicknames: ['Camera'],
        },
        deviceInfo: {
          manufacturer: 'Acme Co',
          model: 'acme-camera',
          hwVersion: '1.0',
          swVersion: '1.0.1',
        },
        willReportState: false,
        attributes: {
          cameraStreamSupportedProtocols: ['webrtc'],
          cameraStreamNeedAuthToken: true,
          cameraStreamSupportsPreview: true
        },
      }],
    },
  };
});
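For reference, the SYNC request that this handler answers is a small JSON payload along the following lines; the requestId value is only a placeholder:
{
  "requestId": "ff36a3cc-ec34-11e6-b1a0-64510650abcf",
  "inputs": [{
    "intent": "action.devices.SYNC"
  }]
}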
Handle the EXECUTE intent
The EXECUTE intent handles commands to update device state. The response returns the status of each command (for example, SUCCESS, ERROR, or PENDING) and the new device state.
- To handle an EXECUTE intent, edit the EXECUTE handler in the functions/index.js file so that it returns the signaling endpoint of the Firebase project:
index.js
app.onExecute(async (body, headers) => {
  var array = headers.authorization.split(' ');
  var snapshot = await firebaseRef.ref('/userId/' + array[1]).once('value');
  var offerGenLocation = snapshot.val().type;
  const {requestId} = body;
  var result = {
    status: 'SUCCESS',
    states: {
      cameraStreamProtocol: 'webrtc',
      cameraStreamSignalingUrl: 'https://us-central1-<project-id>.cloudfunctions.net/signaling?token=' + array[1], // TODO: Add Firebase hosting URL
      cameraStreamIceServers: '',
      cameraStreamOffer: '',
      cameraStreamAuthToken: '',
    },
    ids: [
      'camera'
    ],
  };
  return {
    requestId: requestId,
    payload: {
      commands: [result],
    },
  };
});
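For context, when a user asks to view the camera, Assistant sends an EXECUTE intent that carries a GetCameraStream command roughly like the following; the requestId is a placeholder and the exact params can vary:
{
  "requestId": "ff36a3cc-ec34-11e6-b1a0-64510650abcf",
  "inputs": [{
    "intent": "action.devices.EXECUTE",
    "payload": {
      "commands": [{
        "devices": [{"id": "camera"}],
        "execution": [{
          "command": "action.devices.commands.GetCameraStream",
          "params": {
            "SupportedStreamProtocols": ["webrtc"]
          }
        }]
      }]
    }
  }]
}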
Handle Cross-origin resource sharing (CORS)
- To handle CORS due to the use of the POST method to send the SDP, add the Firebase Hosting URL to the allowList array in the functions/index.js file:
index.js
'use strict';
const functions = require('firebase-functions');
const {smarthome} = require('actions-on-google');
const {google} = require('googleapis');
const util = require('util');
const admin = require('firebase-admin');
var allowList = ['https://www.gstatic.com','https://<project-id>.web.app']; //TODO Add Firebase hosting URL.
For more information about CORS, see Cross-Origin Resource Sharing (CORS).
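As a rough illustration only (the camerastream-done directory contains the actual implementation), the signaling function can apply the allowList array along these lines:
// Sketch: echo the origin back only when it appears in allowList, and answer preflight requests.
exports.signaling = functions.https.onRequest((request, response) => {
  const origin = request.headers.origin;
  if (allowList.includes(origin)) {
    response.set('Access-Control-Allow-Origin', origin);
    response.set('Access-Control-Allow-Methods', 'POST, OPTIONS');
    response.set('Access-Control-Allow-Headers', 'Content-Type');
  }
  if (request.method === 'OPTIONS') {
    // Answer the CORS preflight request without running the handler body.
    response.status(204).send('');
    return;
  }
  // ...handle the posted SDP here...
});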
Handle stream termination
- To handle WebRTC stream termination, add the Firebase signaling function URL to the public/webrtc_generator.js file:
webrtc_generator.js
terminateButton.onclick = function(){
  console.log('Terminating Stream!!');
  var signalingURL = 'https://us-central1-<project-id>.cloudfunctions.net/signaling'; //TODO Add Firebase hosting URL
  var http = new XMLHttpRequest();
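The snippet above is truncated where the request object is created. A minimal sketch of how the termination request might be sent follows; the action field is an illustrative assumption, and the camerastream-done code contains the actual payload:
// Illustrative only: the payload fields are placeholders, not the codelab's exact schema.
http.open('POST', signalingURL, true);
http.setRequestHeader('Content-Type', 'application/json');
http.send(JSON.stringify({action: 'terminate'}));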
Deploy to Firebase
- Deploy the updated cloud fulfillment with the Firebase CLI:
$ firebase deploy
This command deploys a web app and several Cloud Functions for Firebase:
...
✔ Deploy complete!

Project Console: https://console.firebase.google.com/project/<project-id>/overview
Hosting URL: https://<project-id>.web.app
Enable account linking
To enable account linking after your project is deployed, follow these steps:
- In the Actions console, select Develop > Account linking.
- In the OAuth client information section, enter values in the corresponding text boxes for the following fields:
  - Client ID
  - Client secret
  - Authorization URL
  - Token URL
- Click Save > Test.
5. Test the virtual WebRTC camera
- Navigate to the Hosting URL that you saw when you deployed your Firebase project. The CameraStream client app interface appears.
- In the Local Video Resolution panel, select the desired video resolution.
- Grant permission to the CameraStream client app to access your webcam and microphone. A video feed from your webcam appears on the client.
Link to the smart home CameraStream Action
- In the Google Home app, tap Add > Works with Google.
- Search for the Action that you created and then select it.
- Note the unique, five-character, alphanumeric code because you need it later.
- Tap Take me back. The WebRTC camera is added to your structure in the Google Home app.
Start a WebRTC stream
- On the web page for the CameraStream client app, enter the alphanumeric code from the last section in the Account linking token value text box and then click Submit.
- To start a WebRTC session from your Google smart display device, do one of the following:
- Say "Hey Google, stream WebRTC Camera."
- On your Google smart display device, tap Home control > Camera > WebRTC camera.
From the Google smart home CameraStream client app, you can see that the offer SDP and answer SDP are successfully generated and exchanged. The image from your webcam is streamed to your Google smart display device with WebRTC.
6. Congratulations
Congratulations! You learned how to stream from your webcam to a Google Nest display device with the WebRTC protocol.