Misty API Explorer

Easy Commands


Asset Management


To add an image or audio asset to the robot, drag and drop the file onto the appropriate drop box, or click the box to browse for the file. Once the file has been added, its name appears below the drop box with a checkbox to the left. Click the checkbox to select the file, then click the Save to Robot button. To delete a file from the robot, first populate the list of files by clicking the Populate Audio/Image List button, select the file from the dropdown list, and then click the Delete Clip/Image button. The maximum file size is 3 MB. Accepted audio file extensions are .wav, .mp3, .wma, and .aac. Accepted image file extensions are .jpeg, .jpg, .png, and .gif.
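If you prefer to script asset uploads instead of using the drop box, the same operation can be sent to the robot's REST API. The TypeScript sketch below is a minimal example; the /api/audio endpoint path and the FileName, Data, ImmediatelyApply, and OverwriteExisting parameter names are assumptions based on the current Misty REST documentation and may differ on your robot's software version.

    // Minimal sketch: upload an audio clip to the robot as a base64 string.
    // Endpoint path and parameter names are assumptions -- verify for your software version.
    async function saveAudio(robotIp: string, fileName: string, base64Data: string): Promise<void> {
      const response = await fetch(`http://${robotIp}/api/audio`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          FileName: fileName,      // e.g. "hello.wav" (.wav, .mp3, .wma, .aac)
          Data: base64Data,        // file contents, base64 encoded, 3 MB maximum
          ImmediatelyApply: false, // set true to play the clip as soon as it is saved
          OverwriteExisting: true, // replace a clip that already has this name
        }),
      });
      console.log(await response.json());
    }

Image uploads follow the same pattern against the corresponding image endpoint, using an accepted image extension.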

Locomotion


Websockets


Websockets provide up-to-date information on specified objects. Right now you may subscribe to the objects shown below. The Sensor Reading Websockets are an example showing Time of Flight and Battery Charge messages, whose values stream to the text boxes next to them.

To see the websocket responses, you must view them through the browser console.

The generic subscribe example allows you to pick from our growing list of subscription options. If you only select a named object, the Event Name for that object is created from the Named Object name, and that is the name you must use to unsubscribe. If you do not enter a debounce, it defaults to 250 ms; messages are sent at most once per debounce interval. Too many socket subscriptions at a fast debounce can cause performance issues, so remember to unsubscribe when you don't need data and to set the debounce as high as is appropriate for your needs.
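Outside of API Explorer, you can open the same websocket yourself. The TypeScript sketch below connects, subscribes to a named object, and unsubscribes when done. The ws://<robot-ip>/pubsub path and the Operation, Type, EventName, and DebounceMs field names are assumptions based on the current Misty websocket documentation, so verify them against your robot's software version.

    // Minimal sketch: subscribe to a named object over the robot's websocket.
    // Endpoint path and message field names are assumptions -- verify for your robot.
    const robotIp = "10.0.0.10"; // replace with your robot's IP address
    const socket = new WebSocket(`ws://${robotIp}/pubsub`);

    socket.onopen = () => {
      socket.send(JSON.stringify({
        Operation: "subscribe",
        Type: "SelfState",         // the named object to subscribe to
        EventName: "MySelfState",  // defaults to the named object name if omitted
        DebounceMs: 500,           // messages arrive at most once per 500 ms
      }));
    };

    socket.onmessage = (event) => console.log(JSON.parse(event.data));

    // Unsubscribe with the same EventName when you no longer need the data.
    function stopSubscription(): void {
      socket.send(JSON.stringify({ Operation: "unsubscribe", EventName: "MySelfState" }));
      socket.close();
    }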

To filter a subscription down to specific details, put the data property path in the ReturnProperty field; only that data will be returned. The data property path is specified relative to the Named Object and currently must be discovered by examining the data packet. For example, if you want Mental State (which you can't subscribe to directly), which is an object in SelfState, put MentalState in the ReturnProperty field. If you want the specific Valence value of the Affect in Mental State, your ReturnProperty is MentalState.Affect.Valence.

You may also use the same pattern to filter on data; in this case, the data is only sent when the filter is true. For example, to return the above Mental State's Affect data only when the Dominance value in Affect equals 1, use the following settings.

NamedObject: SelfState
Property: MentalState.Affect.Dominance
Comparison: ==
Value: 1
ReturnProperty: MentalState.Affect

The currently allowed comparison options are: ==, !=, <, >=, >, <=, empty and exists
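Translated into a raw subscribe message (sent on the socket from the previous sketch), those settings might look like the following. The EventConditions field and its Property, Inequality, and Value sub-fields are assumptions based on the current Misty websocket documentation; the values mirror the example settings above.

    // Sketch: return MentalState.Affect from SelfState only when
    // MentalState.Affect.Dominance == 1. Field names are assumptions.
    socket.send(JSON.stringify({
      Operation: "subscribe",
      Type: "SelfState",
      EventName: "AffectWhenDominant",
      DebounceMs: 250,
      ReturnProperty: "MentalState.Affect",
      EventConditions: [
        { Property: "MentalState.Affect.Dominance", Inequality: "==", Value: "1" },
      ],
    }));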

Websockets are needed for most of the following commands. API Explorer automatically opens a websocket when you enter an IP address and connect to the robot (at the top of the page).

Sensor Reading Websockets
Other Websockets

Beta Commands


Computer Vision

Enter a name in the text field and start training. Hold your position in front of the camera, about one to two feet away, for 10 seconds. Select Cancel Face Training only if you want to cancel during that period. The camera takes a series of pictures of your face and attempts to create a face matrix so it can recognize you in the future.

Currently, the face detection and recognition data comes directly from the sensors, so some of the data may be incomplete. We are working on improvements to aggregate this data.
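If you want to trigger training from a script instead of the button, a REST call along these lines should work. The /api/faces/training/start path and the FaceId parameter name are assumptions based on the current Misty REST documentation (older software versions may use different, beta-prefixed routes), so verify them for your version.

    // Minimal sketch: start face training for a named person.
    // Endpoint path and parameter name are assumptions -- verify for your robot.
    async function startFaceTraining(robotIp: string, name: string): Promise<void> {
      const response = await fetch(`http://${robotIp}/api/faces/training/start`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ FaceId: name }), // the name entered in the text field
      });
      console.log(await response.json());
    }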

Head Commands

Head movement is still very dependent upon the individual robot at this time. If your robot's head has not been calibrated or is front-heavy, it may not move. A sample request sketch follows the list below.

  • Back (Up)
  • Forward (Down)
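As a rough illustration, the Back (Up) and Forward (Down) buttons correspond to a head-position request like the sketch below. The /api/head path and the Pitch, Roll, Yaw, and Velocity parameter names (and their value ranges) are assumptions based on the current Misty REST documentation and may not match older software.

    // Minimal sketch: tilt the head up or down by changing the pitch value.
    // Endpoint path, parameter names, value ranges, and sign conventions are
    // assumptions -- verify them for your robot's software version.
    async function moveHead(robotIp: string, pitch: number): Promise<void> {
      const response = await fetch(`http://${robotIp}/api/head`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ Pitch: pitch, Roll: 0, Yaw: 0, Velocity: 50 }),
      });
      console.log(await response.json());
    }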

Alpha


Mapping and Exploring

The following is a "quick start" version of the instructions for mapping and exploring. For a more detailed explanation of these functions, see our documentation at https://docs.mistyrobotics.com.

Select Start Mapping. After a few seconds you should obtain pose. Once pose is obtained, the pose circle turns green and pose updates start streaming. If you never get pose, select Stop Mapping and then Reset. Check the status with Get Status, and if it is valid, try again. If you do not see pose updates, it is also possible that the lighting is too low for the robot. If the status does not return ready after multiple Reset and Get Status calls, you may not have turned on the 820, or SLAM may be in a bad state requiring a robot reboot.

Select one of the drive options (Turn in Circle, etc.), or drive the robot around for a fuller map. When you are done mapping, select Stop Mapping and then Get Map to retrieve the map from the robot. When creating a map or tracking, be sure to adjust the velocity so the robot moves slowly and keeps its pose.
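The same mapping workflow can be driven over the REST API. The slam endpoint paths in this sketch follow the pattern of the current Misty REST documentation and are assumptions to verify against your robot's software version.

    // Minimal sketch of the mapping workflow: start mapping, stop mapping, then fetch the map.
    // Endpoint paths are assumptions -- verify for your robot's software version.
    async function call(robotIp: string, method: string, path: string): Promise<unknown> {
      const response = await fetch(`http://${robotIp}${path}`, { method });
      return response.json();
    }

    async function buildMap(robotIp: string): Promise<unknown> {
      await call(robotIp, "POST", "/api/slam/map/start"); // Start Mapping
      // ...drive the robot slowly around the space here, checking that it keeps pose...
      await call(robotIp, "POST", "/api/slam/map/stop");  // Stop Mapping
      return call(robotIp, "GET", "/api/slam/map");       // Get Map
    }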

To drive to a location, figure out the X, Y coordinates (X is up/forward, Y is across) and then select Start Tracking. You should start to see pose updating again. You can now enter the waypoints where you want the robot to drive and select Follow Path (waypoints should be entered in the form X1:Y1,X2:Y2,X3:Y3).

When you are done driving, select Stop Tracking to release resources on the robot.
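Driving to waypoints follows the same pattern: start tracking, send the path, and stop tracking when the drive is finished to release resources. The endpoint paths and the Path parameter name here are assumptions based on the current Misty REST documentation; the waypoint string uses the X1:Y1,X2:Y2 form described above.

    // Minimal sketch: start tracking, follow a waypoint path, then release SLAM resources.
    // Endpoint paths and parameter names are assumptions -- verify for your robot.
    async function driveWaypoints(robotIp: string, waypoints: string): Promise<void> {
      await fetch(`http://${robotIp}/api/slam/track/start`, { method: "POST" });
      await fetch(`http://${robotIp}/api/drive/path`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ Path: waypoints }), // e.g. "1:2,3:4,5:6"
      });
      // In practice, wait until the robot reaches the final waypoint before stopping.
      await fetch(`http://${robotIp}/api/slam/track/stop`, { method: "POST" });
    }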

Mapping

Pose

Read coordinates from the bottom right corner.

When determining waypoints, X is the direction the robot is looking at the start of mapping and is read from the bottom of the map to the top of the map.

Y is read from right to left with zero being the right side of the map.

Pixels per grid is a value from 1 to 20 that indicates the number of pixels per grid cell on the rendered map. The higher the number, the larger the map.

Map legend: Open, Occupied, Covered, Unknown
Tracking

Pose
Follow Path

System Updates


For more information, please refer to the documentation at https://docs.mistyrobotics.com.