A JavaScript SDK for Bose AR-enabled products, including the Bose Frames, Bose QuietComfort 35 Wireless II, and Bose 700 Wireless. Click here for a Live Demo!
- Make sure you have a Web Bluetooth-enabled device
  - Chrome for Desktop [PREFERRED]: enable Web Bluetooth by going to `chrome://flags/#enable-experimental-web-platform-features` and checking **Experimental Web Platform features**
  - iOS: Use this app to demo your web apps. Unfortunately, iOS lags behind on various Web APIs (including Web Bluetooth).
- Update your Bose AR-enabled headset's firmware on their website
- Disconnect your Bose AR-enabled device from your smartphone if you have the Bose Connect App installed.
- Save a local copy of `bose-ar-web-sdk.min.js`
- In your HTML `<head></head>` element, insert the file in a script element: `<script src="bose-ar-web-sdk.min.js"></script>`
- In your HTML `<body></body>` element, insert the following custom element: `<bose-ar-device></bose-ar-device>`. This element represents your Bose AR-enabled device and will be used to interface with it. On your website the element will display a connect button that will attempt to connect to your Bose AR-enabled device when clicked, and it will become hidden once connected (a minimal full-page sketch follows the notes below).
NOTE: Bose AR devices have 2 types of Bluetooth connections: an Audio connection (audio playback and microphone controls) and an AR connection (head-tracking and gestures). This SDK is used to establish an AR connection and will allow you to enable/disable sensors/gestures, as well as access the sensor/gesture data. If you want to access the Bose AR device as an audio device (like normal Bluetooth headphones and headsets), you'll need to connect to it again "the usual way" and use the standard Web Audio API.
ALSO NOTE: You can connect to multiple Bose AR devices (as AR devices) simultaneously, as well as have a single Bose AR device be accessed from multiple browsers (e.g. your smartphone and laptop). However, disabling a Bose AR device's sensor/gesture from one browser disables it for every browser observing that device's sensors/gestures.
ALSO ALSO NOTE: The browser needs permission from the user to connect to the Bose AR device (as an AR device as mentioned in the first "NOTE"), but no action on the Bose AR device is needed (e.g. pressing a "connect" button on the Bose AR device). This means anyone can access an unassuming user's Bose AR device from their browser and enable/disable/read sensor/gesture data without the Bose AR device owner even knowing. I'm not sure why they designed them this way, but I don't work there.
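Putting the setup steps together, here is a minimal page sketch (it assumes `bose-ar-web-sdk.min.js` is saved next to the HTML file):

```html
<!-- Minimal sketch: assumes bose-ar-web-sdk.min.js sits next to this file -->
<!DOCTYPE html>
<html>
  <head>
    <script src="bose-ar-web-sdk.min.js"></script>
  </head>
  <body>
    <!-- Renders a connect button and hides itself once the AR connection is established -->
    <bose-ar-device></bose-ar-device>
  </body>
</html>
```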
To enable sensors before runtime, add them as attributes in the custom element, with an attribute value indicating the refresh rate (in milliseconds or as a string):

`<bose-ar-device gyroscope="20" rotation="fast"></bose-ar-device>`
To enable sensors during runtime, set the custom element's attribute:

`document.querySelector("bose-ar-device").setAttribute("rotation", "fast");`
To disable sensors during runtime, remove the custom element's attribute:

`document.querySelector("bose-ar-device").removeAttribute("rotation");`
Valid sensor attributes:

- `accelerometer`
- `gyroscope`
- `rotation`
- `game-rotation`
Valid sensor attribute values:

- `20` or `"very-fast"`
- `40` or `"fast"`
- `80` or `"normal"`
- `160` or `"slow"`
- `320` or `"very-slow"`
To enable gestures before runtime, add them as attributes in the custom element:

`<bose-ar-device double-tap head-nod head-shake></bose-ar-device>`
To enable gestures during runtime, set the custom element's attribute:

`document.querySelector("bose-ar-device").setAttribute("double-tap", '');`
To disable gestures during runtime, remove the custom element's attribute:

`document.querySelector("bose-ar-device").removeAttribute("double-tap");`
Valid gesture attributes:

- `single-tap` (coming soon)
- `double-tap`
- `head-nod`
- `head-shake`
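As a sketch of toggling a gesture at runtime (the checkbox id here is hypothetical; any UI event works the same way), you can flip the attribute with standard DOM calls:

```javascript
const device = document.querySelector("bose-ar-device");

// Hypothetical checkbox that turns the double-tap gesture on and off
document.querySelector("#double-tap-checkbox").addEventListener("change", event => {
  if (event.target.checked) {
    device.setAttribute("double-tap", ""); // enable the gesture
  } else {
    device.removeAttribute("double-tap"); // disable the gesture
  }
});
```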
- To listen for sensor or gesture events, add an `eventListener` to the custom element:
  `document.querySelector("bose-ar-device").addEventListener("accelerometer", yourCustomCallback);`
Valid event names:

- `"accelerometer"`
- `"gyroscope"`
- `"rotation"`
- `"gameRotation"`
- `"singleTap"` (coming soon)
- `"doubleTap"`
- `"headNod"`
- `"headShake"`
- To get the event data, read it from the custom element's attributes:

```javascript
document.querySelector("bose-ar-device").addEventListener("accelerometer", event => {
  const accelerometerX = Number(document.querySelector("bose-ar-device").getAttribute("accelerometerX"));
});
```
Valid attributes for each event:

- `"accelerometer"`: `"accelerometerX"`, `"accelerometerY"`, `"accelerometerZ"`, `"accelerometerTimestamp"`
- `"gyroscope"`: `"gyroscopeX"`, `"gyroscopeY"`, `"gyroscopeZ"`, `"gyroscopeTimestamp"`
- `"rotation"`: `"rotationW"`, `"rotationX"`, `"rotationY"`, `"rotationZ"`, `"rotationYaw"`, `"rotationPitch"`, `"rotationRoll"`, `"rotationTimestamp"`
- `"gameRotation"`: `"gameRotationW"`, `"gameRotationX"`, `"gameRotationY"`, `"gameRotationZ"`, `"gameRotationYaw"`, `"gameRotationPitch"`, `"gameRotationRoll"`, `"gameRotationTimestamp"`
- `"singleTap"` (coming soon): `"singleTapTimestamp"`
- `"headNod"`: `"headNodTimestamp"`
- `"headShake"`: `"headShakeTimestamp"`
Send us an email at zack@ukaton.com if you have a cool application made with our SDK!
- Download the `extension` folder
- Add your custom code to `injection.js`, inside the Promise returned by the `window.boseARDeviceElement.connect();` method
- Customize the extension interface by adding buttons, sliders, and other controls to `popup.html`
- Add event listeners to the interface elements in `popup.js`, using `sendMessage(message)` to forward the event to the current website you're on. This message should include a `case` property value to specify the purpose of the message.
```javascript
myButton.addEventListener("click", event => {
  const myMessage = {
    case: "myCase",
  };
  sendMessage(myMessage);
});
```
- Add your `case` string to the `switch(event.data.case){}` block in `injection.js`, which will receive the `message` object created in the previous step. Here you can define your custom behavior, using both the message case and any extra values you passed in the `message` object (see the sketch after this list).
- Load your extension into Chrome by going to `chrome://extensions/`, clicking **Load unpacked**, and selecting your edited `extension` folder.
- You can change the name of the extension by going to `manifest.json` and changing the `name` property.
- You can change the icon by replacing `icon.png` with your own image.
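To illustrate the `switch` step, here is a sketch of the branch you might add in `injection.js`. The `"myCase"` string matches the message above; the behavior inside it, and the assumption that `window.boseARDeviceElement` is the `<bose-ar-device>` element, are placeholders:

```javascript
// Inside the existing switch(event.data.case){} block in injection.js
switch (event.data.case) {
  // ...existing cases...
  case "myCase":
    // Placeholder behavior: enable the rotation sensor,
    // assuming window.boseARDeviceElement refers to the <bose-ar-device> element
    window.boseARDeviceElement.setAttribute("rotation", "fast");
    break;
}
```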
Prefer Cycling '74's Max? Now you can use as many Bose AR devices as you want in a Max Patch using Node for Max via WebSockets!
- Download the `bose-for-max` folder
- Open `bose-for-max.maxpat` in Max (Max 8 is required for Node for Max)
- Go to `localhost:3000` to connect your Bose AR device via the Web SDK (the top of the webpage will display the socket connection index to distinguish multiple devices in the patch)
- Once connected, you can enable/disable sensors/gestures either in the patch or on the webpage
- For multiple devices, copy and paste the Bose AR device region under `Bose AR Device #`, and change the `#` to indicate other devices via their socket connection index (shown at the top of the webpage)
Our time is limited, so we'd greatly appreciate it if you guys could implement some of these ideas:
- Social Area Network - Place voice recordings on a map for others to hear (or only for yourself, as a location-triggered notes app or to-do list).
- Where You At? - Call a friend (or a group of friends) and know where they are by listening to what direction their voice is coming from, using WebRTC (or a WebRTC wrapper like Simple-Peer) to stream both voice and location data, along with the Resonance Audio SDK for sound spatialization.
- Yelp Radio - Hear Yelp reviews as you pass by restaurants, using your location and Yelp's APIs, and converting written reviews to speech with the Web Speech API (see the sketch after this list).
- Twitter Extension - Go on Twitter and convert tweets to speech, listening to your feed in the background. You can even nod to "like" a tweet or double-tap to comment.
- Spotify Spots - Use the Spotify Web API and location data to create playlists for frequent places and paths! You can curate your routine and play certain songs when you're at the gym, on the road, or at work.
- Waze Gaze - Use the Waze API and the Resonance Audio SDK to alert users with sonified and spatialized notifications (e.g. play a police siren spatialized to their relative geolocation), and allow drivers to share their own traffic alerts by saying what they see instead of looking down at the screen. (Additionally, the gyroscope sensor can be used to detect speed bumps, and the rotation sensor can be used to know if the driver is looking in the direction they're driving in to prevent prolonged distraction.) Plus, the Spotify Web API can be used to localize music/podcasts/audiobooks based on where the driver needs to go (e.g. panning the audio left or right if they need to make a turn soon).
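As a sketch of the text-to-speech piece shared by ideas like Yelp Radio and the Twitter Extension, the Web Speech API's `speechSynthesis` can read a string aloud when a head-nod gesture fires (the text here is a placeholder; a real app would pull it from reviews or tweets):

```javascript
const device = document.querySelector("bose-ar-device");
device.setAttribute("head-nod", ""); // enable the head-nod gesture

device.addEventListener("headNod", () => {
  // Placeholder text; swap in a Yelp review, a tweet, etc.
  const utterance = new SpeechSynthesisUtterance("Reading the next item.");
  window.speechSynthesis.speak(utterance);
});
```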