Running Ollama locally on Android device
The Future is Local, and it’s Mobile
This blog post demonstrates how easy and accessible Retrieval-Augmented Generation (RAG) becomes when we leverage the strengths of AnythingLLM and Ollama together.
One of the perks of my job is having access to hardware and software with which I can play and experiment. To truly understand the hype behind AI, I needed t...
I recently stumbled across a book called “Building a Second Brain” by Tiago Forte. The book discusses using PKM (Personal Knowledge Management) as a S...
During the recent Cisco Live Melbourne 2023, I was part of the Cisco NOC (Network Operations Center) team. We were tasked with bringing our own (bump-in) Cisco Netw...
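To make the title concrete, here is a minimal sketch of one common way to get Ollama running locally on an Android device: Termux plus a proot-distro Linux environment. This is an assumption-laden outline, not the post's exact procedure; it assumes Termux (from F-Droid), the `proot-distro` package, the official Ollama Linux installer, and enough free RAM for a small model.

```shell
# Inside Termux on the Android device:
pkg update && pkg upgrade          # refresh Termux packages
pkg install proot-distro           # lightweight chroot-style Linux environments
proot-distro install debian        # download a Debian rootfs
proot-distro login debian          # drop into a Debian shell

# Inside the Debian environment:
apt update && apt install -y curl
curl -fsSL https://ollama.com/install.sh | sh   # official Ollama Linux installer
ollama serve &                     # start the local API (default port 11434)
ollama run llama3.2:1b             # example: pull and chat with a small model
```

With `ollama serve` running, a front end such as AnythingLLM can point at `http://localhost:11434` as its LLM provider, which is what enables the fully local RAG setup described above. The model tag shown is just an example; pick one sized for your phone's memory.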