Getting Started with MyAI: Running Local AI Models on Windows with WSL

A friendly guide to setting up and running local AI models using MyAI and WSL on your Windows machine

If you’ve ever been curious about running AI models right on your own computer instead of relying on cloud services, you’re in the right place. Local AI models are becoming more accessible thanks to tools like MyAI, which make it pretty straightforward to set up and experiment with AI directly on your Windows machine using WSL (Windows Subsystem for Linux).

What Is MyAI and Why Use It?

MyAI is essentially a wrapper around vLLM, designed to work on Windows with WSL. If you aren’t familiar with it, WSL lets you run a Linux environment right on Windows, which is a handy way to tap into powerful open-source tools that usually run on Linux. This makes it perfect for folks who want to explore AI without jumping into complicated installations or cloud subscriptions.

MyAI simplifies the process of downloading and running a local AI model through a script that handles everything from setup to launching the model. It grabs models from popular repositories like Hugging Face, which is a big hub for open-source AI models and datasets. What’s neat is that the script works almost like a one-click install and launch, and it even supports different operation modes: client-only mode, where your machine acts as just the AI client, or client-server mode, which runs the model locally and lets you interact with it right from your computer.
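
Under the hood, vLLM typically exposes the model through an OpenAI-compatible HTTP API, so once client-server mode is up you should be able to talk to it with any ordinary HTTP client. Here’s a minimal sketch of what that might look like, assuming MyAI keeps vLLM’s default port of 8000 and that you swap in the name of whichever model the script downloaded:

    # ask the locally served model a question (the port and model name are assumptions)
    curl http://localhost:8000/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{
            "model": "<downloaded-model-name>",
            "messages": [{"role": "user", "content": "Hello! What can you run on 12 GB of VRAM?"}]
          }'

If that returns a JSON response with a generated message, the server side is up and running.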

Who Is This For?

If you already have Ubuntu 24.04 installed under WSL, this tool might not be for you since it’s more about getting folks started with a fresh WSL setup. But if you’re new to WSL or AI models on your PC, MyAI can significantly smooth out the learning curve. It’s great for enthusiasts with a decent GPU; for example, someone with about 12 GB of VRAM can actually run a reasonably capable AI model locally. This gives hobbyists and curious learners a chance to play with AI without renting servers.

How Does It Work?

The script covers multiple command environments (CMD, PowerShell, C#, and Bash), so you can launch it from whichever shell you’re most comfortable in. During setup, options like which AI model to download are set at the top of the script. Currently it defaults to a variant of the LLaMA model chosen to match your GPU’s VRAM, but you can manually enter a Hugging Face repository link if you want something different.
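
Just to give a feel for it, the user-facing options at the top of a launcher script like this usually boil down to a handful of variables. The names below are purely illustrative (they are not copied from MyAI’s actual script), shown in Bash since that is what runs inside WSL:

    # --- illustrative configuration block; variable names are hypothetical, not MyAI's ---
    MODEL_REPO="meta-llama/Llama-3.1-8B-Instruct"  # Hugging Face repo to download (example value)
    GPU_VRAM_GB=12                                 # rough VRAM budget used to pick a default model
    MODE="client-server"                           # "client-only" or "client-server"

Editing a few lines like these at the top of the file is about the extent of the configuration before you run it.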

The creator is working on a simple user interface to make model selection even easier down the road, but for now the focus is on simplicity and getting a model running quickly. The current setup doesn’t yet integrate tool usage (letting the model call external tools or functions), but that’s an anticipated feature as the project evolves.

Why Run AI Models Locally?

Running local AI models has some clear benefits:

  • Privacy: Your data stays on your device.
  • Offline Access: No internet needed once set up.
  • Experimentation: You can tweak, test, and learn without limitations imposed by cloud services.

It’s exciting to see the gap narrow between what commercial AI cloud providers offer and what home users can do, thanks to projects like MyAI and vLLM.

Ready to Try?

If you want to dip your toes in, first ensure you have Windows with WSL set up. Installing WSL on Windows 10 or 11 is pretty straightforward—you can follow the official Microsoft guide here: Install WSL.
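
For reference, on an up-to-date Windows 10 or 11 install the core of that guide comes down to a couple of commands in an elevated PowerShell window (the exact distribution name is whatever the online list shows, Ubuntu-24.04 at the time of writing):

    # see which Linux distributions are available to install
    wsl --list --online

    # install WSL together with the distribution you want, e.g. Ubuntu 24.04
    wsl --install -d Ubuntu-24.04

A reboot is usually required afterwards, and the first launch of the distribution asks you to create a Linux username and password.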

Then, you can check out the MyAI project and grab the script from its GitHub repo: MyAI GitHub. Keep in mind this is still a starter script, so some troubleshooting might be needed, but it’s a fantastic way to start your local AI journey.
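
The exact file names depend on the repository, so treat the following as a rough sketch with placeholders rather than literal commands; from inside your WSL shell the flow is roughly:

    # clone the project (use the URL shown on the MyAI GitHub page)
    git clone <MyAI-repo-URL>
    cd <MyAI-repo-folder>

    # look over the options at the top of the script, then run it
    bash ./<script-name>.sh

The script then takes over the setup, model download, and launch described above.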

Learn More

If you’re curious about vLLM itself, the underlying library that powers all of this, the official vLLM page is a good place to get familiar with it: vLLM Official Site.
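
If you’d rather see what MyAI is automating, using vLLM by hand inside a WSL Ubuntu environment looks roughly like this (a sketch assuming a working Python and GPU driver setup; the repo ID is a placeholder for whatever model fits your VRAM):

    # install vLLM into your Python environment
    pip install vllm

    # download and serve a Hugging Face model behind an OpenAI-compatible API (port 8000 by default)
    vllm serve <huggingface-repo-id>

MyAI wraps steps like these, plus the WSL-specific setup, into its single script.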

Diving into local AI models might seem intimidating at first, but with tools like MyAI, that entry barrier is getting lower every day. Give it a shot—you might be surprised what your own computer can do!