There are many reasons why having a local chatbot would be preferable to using a cloud-hosted solution such as ChatGPT. Your data stays local so your privacy is intact, you don't need Internet access to use it, and you might have more control over the output.

Installing and Running MLC LLM on an iPhone

While anyone can install the PC versions, the iOS version requires you to use TestFlight, Apple's developer system, on your device, and there's a limit of 9,000 iOS users who can install the test app at one time. You can also compile it yourself from the source code.

It's supposed to work on any iPhone, iPad or iPod Touch that runs iOS 13 or higher, but in our experience, it requires one of the more powerful Apple devices with plenty of RAM. Freedman installed the MLC LLM test app, a 3GB download, on his iPhone 11 Pro Max. However, on launch, the app crashed after showing the message "Initialize…" every time he ran it.

Later I asked my friend, Scott Ertz of PLUGHITZ Live, to try installing MLC LLM on his iPhone 14 Pro Max, which is more powerful than the iPhone 11 and has 6GB of RAM instead of 4GB. He had to try a couple of times to get the install to work, but once installed, the app itself ran without crashing. However, he said that the app dominated the phone, using all of its resources and slowing other apps down. He then tested with an iPhone 12 Pro Max, which also has 6GB of RAM, and found that it also worked.

He asked the MLC LLM a few questions, and the responses were mixed. When he asked it to choose the best processor for gaming, it gave a vague, non-committal answer that didn't mention any specific models and just said to go for more cores and higher clock speeds. When he asked it about Tom's Hardware, it gave a reasonable answer about what we do. But when he asked what PLUGHITZ Live, a tech podcast company, was, he got a very odd answer saying that it's an electronic music concert series run by "DJ Plug."

I had no problem installing and running MLC LLM on my ThinkPad X1 Carbon (Gen 6) laptop, which runs Windows 11 on a Core i7-8550U CPU and an Intel UHD 620 GPU. This is a five-year-old laptop with integrated graphics and no VRAM.

To set up MLC LLM, I first had to install Miniconda for Windows, which is a light version of the popular Conda package manager (you can also use the full Anaconda version). With Conda, you can create separate environments that have their own sets of Python packages that don't conflict with the other packages on your system.

After installing Miniconda, I launched the Anaconda Prompt (a version of the command prompt that runs Conda). Then I used the set of instructions on mlc.ai to create an environment called mlc-chat and download the language model into it. The Vicuna-7B-V1.1 model took up just 5.7GB of storage space, and the rest of the project uses up an additional 350MB or so.

The chatbot runs in a command prompt window. To launch it, I had to activate the mlc-chat conda environment and enter the command mlc_chat_cli. When you launch MLC LLM's chatbot, it first asks you for your name. Then it greets you and asks how it can help you, and you can ask it questions.

On my laptop, the bot was really slow to respond, taking close to 30 seconds to begin entering a response to any query. Like ChatGPT, it types the answer while you watch, so it can take a minute or two to see a complete response. I assume that this would go faster on a more powerful device.

(Image credit: Tom's Hardware screenshot)

The quality of answers I got from the LLM was nothing to write home about. It gave accurate answers to some factual questions but made up a fictional biography for me. It had the ability to write poetry, but did an awful job. It also was incapable of taking follow-up questions, as it treated each prompt as a completely new conversation.

When I asked the bot to tell me who the fifth president of the U.S. was, it gave an atypical but truthful response, naming Thomas Jefferson. While most people would say that James Monroe is the correct answer, because he was the fifth person to be president, if you count presidential terms, Jefferson's second term is the fifth overall. I asked the MLC LLM chatbot "what is Tom's Hardware" and got a very accurate answer, describing our website and the different types of content we create.
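The PC setup described above boils down to a short command-prompt session. This is a minimal sketch based on the instructions posted on mlc.ai at the time of testing; the exact package channel, package name and model repository are assumptions and may have changed since, so check mlc.ai for the current steps:

```shell
# Sketch of the setup steps above, following the mlc.ai instructions
# at the time of writing (package and repo names may have changed).

# 1. Create and activate a fresh Conda environment for the chatbot
conda create -n mlc-chat
conda activate mlc-chat

# 2. Install the chat CLI from the mlc-ai Conda channel
conda install -c mlc-ai -c conda-forge mlc-chat-nightly

# 3. Download the quantized Vicuna-7B model weights (about 5.7GB)
git lfs install
mkdir -p dist
git clone https://huggingface.co/mlc-ai/demo-vicuna-v1-7b-int3 dist/vicuna-v1-7b

# 4. Launch the chatbot in the command prompt window
mlc_chat_cli
```

On later runs you only need the `conda activate mlc-chat` and `mlc_chat_cli` steps; the environment and model weights persist on disk.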