I just ordered an Nvidia Jetson Nano — if you don’t know what this is, here’s why I think it’s so darn cool
My Jetson Nano order is placed, it arrives the first week of January, and I’ll be honest: I haven’t been this excited about a piece of hardware in a very long time. If you don’t know what this beautiful device is, hopefully by the end of this article you’ll be equally excited. So let’s break it down.
What is the Jetson Nano?
There’s no way I’m going to steal Nvidia’s thunder here in explaining this, so here it is directly from the source:
The Jetson Nano module is a small AI computer that gives you the performance and power efficiency to take on modern AI workloads, run multiple neural networks in parallel, and process data from several high-resolution sensors simultaneously. This makes it the perfect entry-level option to add advanced AI to embedded products. (Source — Nvidia)
Now let’s go deeper.
The Jetson Nano represents a shift in hardware technology, especially for people like me who want to build AI solutions themselves rather than renting hardware from someone else.
In the AI world today, two paths are developing: you either rent AI horsepower from a company like Google or Amazon, or you get under the hood and run it yourself. Right now I’d say the easiest way to dive in and run AI yourself at home is on your current computer, using one of Meta’s Llama models.
With Llama you can download super powerful AI models directly onto your computer, then fine-tune and train away without paying a dime. But here’s the rub: your computer, even if it’s a sleek new MacBook, is kinda big and heavy, and probably overpowered for a lot of the things you’d want to do.
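If you’re curious what that looks like in practice, here’s a minimal sketch of running a Llama model locally. I’m not prescribing any particular tooling; this example happens to use the llama-cpp-python package, and the model path is just a placeholder for whichever Llama weights file (in GGUF format) you’ve downloaded:

```python
# Minimal local Llama inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path is a placeholder; point it at a GGUF Llama weights file you've downloaded.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-7b.Q4_K_M.gguf")  # hypothetical local file

# Ask the model a question and print the completion
output = llm("Q: Why run an AI model on a tiny device? A:", max_tokens=64)
print(output["choices"][0]["text"])
```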
When I was at CMU I took an embedded systems class. It blew my mind and inspired me to take on some of the fun programming challenges that come up when you’re trying to run code on smaller devices.
It shouldn’t be a shocker to anyone that your $2,000 MacBook can run an AI model like Llama, but what if you wanted something, say, one tenth the size? That’s where things get tricky.
And this is where Nvidia is now playing with the Jetson Nano.
Yes, you read that right: the Jetson Nano module is smaller than a credit card, and yes, you can run models on it, which means you can now embed AI into smaller and smaller things. And that, to me and many others, is insanely fascinating. So fascinating, in fact, that I couldn’t help myself; I had to have one.
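To give a flavor of what running a model on the Nano can look like, here’s a rough sketch of live image classification using Nvidia’s open-source jetson-inference library. Mine hasn’t arrived yet, so treat this as an illustration rather than something I’ve tested: it assumes the library is installed on the board and a camera is plugged in.

```python
# Rough sketch of on-device image classification with Nvidia's open-source
# jetson-inference library (https://github.com/dusty-nv/jetson-inference).
# Assumes the library is installed on the Nano and a CSI camera is attached.
import jetson.inference
import jetson.utils

net = jetson.inference.imageNet("googlenet")   # pre-trained network, downloaded on first use
camera = jetson.utils.videoSource("csi://0")   # use "/dev/video0" for a USB webcam instead

img = camera.Capture()                         # grab one frame from the camera
class_id, confidence = net.Classify(img)       # run the frame through the network on the GPU
print(net.GetClassDesc(class_id), f"{confidence:.2f}")
```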
And I’m late to the party as a lot of people are already doing insanely cool things with Jetson devices.
As for what I’m going to use mine for, I’m still noodling on ideas, but you can bet I’ll be sharing on Medium once I land on something. For now, since it’s going to be a couple of weeks until my Jetson Nano arrives, I’ll be reading docs and looking at what other people have done for a little more inspiration.
Oh, and if you think you have to be some kind of expert developer to jump in and use this, think again. Honestly, I think the Jetson Nano is probably a great way for people who haven’t written code before to learn to code. There’s nothing better than learning to code with a real-life example; it’s so much more fun than a course or a book. So dive in, get one, and play around with it. Heck, here’s a starting point.
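As a first hello-world on the Nano, assuming you’ve installed PyTorch on the board (Nvidia publishes Jetson-compatible PyTorch wheels), you could simply check that the GPU is visible and do a little math on it:

```python
# A tiny "is everything wired up?" first script for the Nano.
# Assumes PyTorch is installed (Nvidia publishes Jetson-compatible wheels).
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # Do a little math on the GPU just to prove it works
    x = torch.rand(1000, 1000, device="cuda")
    print("Sum of a random 1000x1000 matrix, computed on the GPU:", x.sum().item())
```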
Okay, that’s it for now. I might share another article about the Jetson Nano before it arrives if this one gets enough interest. So please give some claps and share if you like this and want me to write more about my journey with the Jetson Nano!