With the release of Llama 3 from Meta yesterday afternoon, local LLM capabilities have gotten noticeably more powerful!
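
For anyone who wants to try it on their own machine, here is a minimal sketch of one way to query Llama 3 locally. It assumes the Ollama runtime and its Python client are installed and that the `llama3` model has already been pulled; the prompt is purely illustrative and not from the original post.

```python
# Minimal local Llama 3 query via the Ollama Python client (an assumed setup,
# not the only way to run the model locally). Requires: `pip install ollama`
# and `ollama pull llama3` beforehand.
import ollama

response = ollama.chat(
    model="llama3",  # assumes the llama3 model is available locally
    messages=[{"role": "user", "content": "Summarize what makes Llama 3 notable."}],
)

# Print the assistant's reply text.
print(response["message"]["content"])
```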

Gary Thompson @gwthompson