To clarify, for the micro:bit I believe there are three options:

  1. The model runs on the host computer and passes data to/from the micro:bit via Bluetooth (e.g. ML-Machine, PlushPal); a rough sketch of this setup follows the list
  2. An external camera based on the Kendryte K210 comes with pre-trained models (e.g. HuskyLens, KittenBot KOI) and usually connects to the micro:bit via cables
  3. Creating a model in Edge Impulse and downloading it to the micro:bit V2. This can use the IMU for gestures or the microphone for sounds. Conceivably it could also do image classification with the Arducam Mini, but I have not used that myself. Take a look at this for more information: https://www.hackster.io/news/arducam-mini-is-the-first-to-add-machine-vision-to-the-bbc-micro-bit-2755aed686be
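
To give a feel for option 1, here is a minimal MakeCode (static TypeScript) sketch of the micro:bit side. It assumes the host app sends one predicted label per line over the Bluetooth UART service; the "left" and "right" labels are placeholders I made up, not what ML-Machine or PlushPal actually send.

    // Rough sketch: react to labels streamed from the host classifier
    // over the Bluetooth UART service. Label names are placeholders.
    bluetooth.startUartService()
    bluetooth.onBluetoothConnected(function () {
        basic.showIcon(IconNames.Yes)
    })
    bluetooth.onBluetoothDisconnected(function () {
        basic.showIcon(IconNames.No)
    })
    bluetooth.onUartDataReceived(serial.delimiters(Delimiters.NewLine), function () {
        let label = bluetooth.uartReadUntil(serial.delimiters(Delimiters.NewLine))
        if (label == "left") {
            basic.showArrow(ArrowNames.West)
        } else if (label == "right") {
            basic.showArrow(ArrowNames.East)
        } else {
            basic.showString(label)   // unknown label: just display it
        }
    })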

I hope this helps, Hal


On Friday, September 22, 2023 at 01:53:20 AM CDT, Dave Touretzky <xxxxxx@cs.cmu.edu> wrote:


>  As I understand the micro-bit AI stuff, all the AI is happening on the
>  laptop and micro-bit is just reacting to what the laptop reports. While
>  Cozmo and the VET robot David talked about I think has enough hardware
>  on the robot to work without an additional device. Is that right?

Almost.  Cozmo does most of its processing via an app that runs on a
tablet or smartphone and communicates with the robot via WiFi.  This
keeps the robot's cost down and also gives it longer battery life.  This
was a smart move at the time, but when Anki developed their next robot,
Vector, they put the processing on board using a much more powerful chip
comparable to what one would find in a cellphone.  But Vector doesn't
have the same educational focus as Cozmo.

There are two key differences between an AI-powered robot like Cozmo
and a micro:bit attached to a laptop running an AI app like a Teachable
Machine classifier.  First, the robot has the camera on board, so as it
moves through the world, its view changes as a result of its own
actions.  The laptop can look at the world but it can't control what it
sees.

Second, robots can take action to change the state of their world, such
as by picking up objects and moving them around.  A laptop can use the
micro:bit to flash lights or operate a servo to wave something around,
but it would be difficult to achieve goal-based manipulation of objects.
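
To make that concrete, here is a minimal MakeCode (static TypeScript) sketch of the kind of output the laptop can trigger, assuming it sends a label over the Bluetooth UART service; the pin, angles, and the "wave" label are illustrative choices, not part of any particular app.

    // Rough sketch: when the host sends the (made-up) label "wave" over
    // the Bluetooth UART service, wag a hobby servo on P0 and show a heart.
    bluetooth.startUartService()
    bluetooth.onUartDataReceived(serial.delimiters(Delimiters.NewLine), function () {
        if (bluetooth.uartReadUntil(serial.delimiters(Delimiters.NewLine)) == "wave") {
            for (let i = 0; i < 3; i++) {
                pins.servoWritePin(AnalogPin.P0, 30)    // swing one way
                basic.pause(300)
                pins.servoWritePin(AnalogPin.P0, 150)   // and back
                basic.pause(300)
            }
            basic.showIcon(IconNames.Heart)
        }
    })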


-- Dave