Llama 3, currently among the strongest open-source LLMs, has been released. Here is how you can run Llama 3 70B locally with just a 4GB GPU, even on a MacBook.

Read more here: External Link
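
The usual trick behind claims like this is layered (layer-by-layer) inference: only one transformer block's weights sit on the GPU at a time, so peak memory is roughly one layer plus activations rather than the full 70B model, traded off against much slower generation. Below is a minimal, hedged sketch of that general idea in PyTorch; the toy `nn.Linear` layers, file names, and sizes are illustrative stand-ins, not the linked article's actual code or any real library's API.

```python
import torch
import torch.nn as nn

# Toy sizes for illustration; Llama 3 70B has 80 decoder layers with hidden size 8192.
NUM_LAYERS, HIDDEN = 4, 1024
device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretend each layer's weights were sharded to disk ahead of time (hypothetical file names).
for i in range(NUM_LAYERS):
    torch.save(nn.Linear(HIDDEN, HIDDEN).state_dict(), f"layer_{i}.pt")

# Stand-in for the token embeddings / running hidden state.
x = torch.randn(1, HIDDEN, device=device)

for i in range(NUM_LAYERS):
    # Load only this layer's weights onto the GPU.
    layer = nn.Linear(HIDDEN, HIDDEN)
    layer.load_state_dict(torch.load(f"layer_{i}.pt"))
    layer.to(device)

    # Run the activations through it.
    with torch.no_grad():
        x = layer(x)

    # Free the layer before loading the next one, keeping peak memory ~one layer.
    del layer
    if device == "cuda":
        torch.cuda.empty_cache()

print(x.shape)  # final hidden state; a real pipeline would finish with the LM head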