BeagleBoard is akin to a Raspberry Pi. It's not meant to be a powerhouse PC, or in this case, a powerhouse AI computing platform.
It's meant for embedding, tinkering, and learning.
To that end, 4GB of RAM on an AI Accelerator board is fine - the expected workloads will not consume a lot of RAM. This also makes the lack of NVMe sufficient as well.
For more "horsepower" there is also the BeagleBone AI-64[1], which claims up to 8 TOPS.
This looks interesting, but the trouble with these small built-in accelerators is that they were mostly designed when YOLO was the pinnacle of edge applications. These days they're grossly underpowered…
I find it interesting that the single-board computer market seems to be coalescing around the Raspberry Pi B models as the standard form factor. This device in particular has almost all of the same IO connectors as the Raspberry Pi 5 (one microHDMI port is missing) in all the same places, so it should be compatible with Raspberry Pi 5 cases. I wonder whether the pinout on the 40-pin header is the same as that on the Pi?
I think most of us here are familiar with the fact that amd64 machines are made entirely from commodity parts: ATX cases, ATX power supplies, and so on. I wonder whether there's a similar commodification in the offing for the Pi form factor?
4 TOPS-capable
So, reach out & it'll be there?
The Jetson Orin Nano has no hardware video encoder, and probably no hardware decoder either, so you need to use the CPU for the heavy lifting in 2024.
Not sure about the BeagleY-AI here; since it has DSPs inside, it's probably also doing software encoding, just on the DSP instead of the CPU.
For power efficiency, I would think a cheap hardware encoder is the way to go; I'm surprised both rely on software encoding.
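To put the power-efficiency point in rough numbers, here's a toy comparison of energy per frame at 1080p30. The power figures are my assumptions for illustration, not measurements of either board:

```python
# Rough energy-per-frame comparison for 1080p30 encoding.
# ALL figures below are assumptions, not measurements: dedicated encoder
# blocks typically draw well under a watt, while software encoding on a
# small ARM core (or DSP) can burn several watts sustained.

HW_ENCODER_WATTS = 0.3   # assumed: dedicated H.264 encoder block
SW_ENCODER_WATTS = 4.0   # assumed: software encode on CPU/DSP
FPS = 30

def millijoules_per_frame(watts: float, fps: int = FPS) -> float:
    # 1 W = 1 J/s, so energy per frame = power / frame rate
    return watts / fps * 1000

hw = millijoules_per_frame(HW_ENCODER_WATTS)   # 10 mJ/frame
sw = millijoules_per_frame(SW_ENCODER_WATTS)   # ~133 mJ/frame
print(f"hardware: {hw:.0f} mJ/frame, software: {sw:.0f} mJ/frame")
```

Under those assumed draws, software encoding costs an order of magnitude more energy per frame, which matters a lot on a battery-powered edge device.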
I don't know anything about the Beagle. How is it different from, say, a Raspberry Pi 5? It says open source hardware, but what does that actually mean? I'm curious about buying one, but I want to understand them better first.
Other boards have NPUs with 4 TOPS too, plus more memory - e.g. the Orange Pis.
… but it doesn't look like any of the usual LLM suspects can run on the NPU, so I'm not sure it's much use. I've seen some OpenCV code, but that's about it.
4GB of RAM seems low. No NVMe, even though it has PCIe :(
Beware of the documentation though, which is literally a blank page "coming soon!":
I think we need some sort of mlbench but for hardware ranking... I have no idea what these devices are capable of and neither do the vendors apparently.
I can't wait for it to be sold out, or only sold at shops that want like $40 for international shipping…
Is the low level firmware open source?
I know it isn't on the RPi, which runs a proprietary Broadcom version of Microsoft ThreadX.
With the two DSPs, seems like this would be a nice board to build a guitar effects pedal around.
Meanwhile, in Taipei...
https://www.extremetech.com/computing/intel-40-tops-is-the-n...
4GB RAM is not much.
AI accelerator? Really? I thought AI acceleration in this day and age meant more than multiplying two matrices together. If not, what's CUDA all about?
I honestly thought this thing was going to run TOPS-10 or TOPS-20.
I'm a bit disappointed.
I wish there were some reference for what you can actually do with 4 or 8 TOPS and 4 or 8 or whatever GiB of accelerator memory. Can I run speech recognition? A model for object detection in images? In video? LLMs - probably not. Stable Diffusion also seems out of the question. But really - what can I run?
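You can at least bound this with back-of-envelope arithmetic. A minimal sketch, where the ~30% sustained-utilization factor and the per-model FLOP counts are my rough assumptions (and INT8 TOPS are hand-waved as 1:1 with FLOPs), so treat the results as order-of-magnitude only:

```python
# Back-of-envelope: inferences/sec you might get from an N-TOPS accelerator.
# Assumptions (mine, not vendor numbers): ~30% sustained utilization, and
# INT8 TOPS mapped 1:1 onto model FLOP counts. Both are optimistic.

def inferences_per_second(tops: float, gflops_per_inference: float,
                          utilization: float = 0.3) -> float:
    """Estimate inference rate from accelerator TOPS and per-inference cost."""
    effective_ops_per_sec = tops * 1e12 * utilization
    return effective_ops_per_sec / (gflops_per_inference * 1e9)

# Rough published compute costs per inference (approximate):
models = {
    "MobileNetV2 (image classification)": 0.6,  # ~0.6 GFLOPs per image
    "YOLOv8n (object detection, 640px)":  8.7,  # ~8.7 GFLOPs per frame
}

for name, gflops in models.items():
    for tops in (4, 8):
        rate = inferences_per_second(tops, gflops)
        print(f"{name} @ {tops} TOPS: ~{rate:.0f} inferences/s")
```

By this crude estimate, 4 TOPS comfortably covers real-time classification and object detection on video, while LLMs and Stable Diffusion need orders of magnitude more compute (and more memory than 4GB) - which matches the "designed for the YOLO era" point made upthread.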