<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Blue Sky Projects]]></title><description><![CDATA[Thoughts, stories and ideas.]]></description><link>https://blog.studien.us/</link><image><url>https://blog.studien.us/favicon.png</url><title>Blue Sky Projects</title><link>https://blog.studien.us/</link></image><generator>Ghost 5.80</generator><lastBuildDate>Fri, 08 May 2026 15:22:52 GMT</lastBuildDate><atom:link href="https://blog.studien.us/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Running an LLM locally on your own machine]]></title><description><![CDATA[<p>Gone are the days when you downloaded the raw model off of Hugging Face, struggled to install all the dependencies, and wrote your own Python script to set everything up. Ollama takes the guesswork out of running an LLM on your own machine!</p><p>Head on over to Ollama&apos;s website and</p>]]></description><link>https://blog.studien.us/running-an-llm-locally-on-your-own-machine/</link><guid isPermaLink="false">661197afdac5150001a17d9b</guid><dc:creator><![CDATA[Cameron Anderson]]></dc:creator><pubDate>Sat, 06 Apr 2024 19:05:07 GMT</pubDate><media:content url="https://blog.studien.us/content/images/2024/04/libraries.svg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.studien.us/content/images/2024/04/libraries.svg" alt="Running an LLM locally on your own machine"><p>Gone are the days when you downloaded the raw model off of Hugging Face, struggled to install all the dependencies, and wrote your own Python script to set everything up. Ollama takes the guesswork out of running an LLM on your own machine!</p><p>Head on over to Ollama&apos;s website and download the appropriate build for your machine. 
<a href="https://ollama.com/download?ref=blog.studien.us">https://ollama.com/download</a><br><br>I&apos;m running on macOS, so I downloaded the app, and it even gave me a cute toolbar icon.<br></p><figure class="kg-card kg-image-card"><img src="https://blog.studien.us/content/images/2024/04/image.png" class="kg-image" alt="Running an LLM locally on your own machine" loading="lazy" width="1190" height="258" srcset="https://blog.studien.us/content/images/size/w600/2024/04/image.png 600w, https://blog.studien.us/content/images/size/w1000/2024/04/image.png 1000w, https://blog.studien.us/content/images/2024/04/image.png 1190w" sizes="(min-width: 720px) 720px"></figure><p>After installation, you can simply open a terminal and run the following command:</p><p><code>ollama run llama2</code></p>
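<p>A few other Ollama commands come in handy here. This is just a quick sketch of the CLI as I use it, so check <code>ollama --help</code> for the full list on your version:</p><pre><code># download (or update) a model without starting a chat
ollama pull llama2

# show which models you have locally
ollama list

# delete a model to free up disk space
ollama rm llama2</code></pre>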
<p>If this is your first time running the llama2 model, it will be downloaded automatically for you. Now llama2 is a fine model, but I&apos;ve found the censorship built into it hampers its true performance. Check out this query:</p><figure class="kg-card kg-image-card"><img src="https://blog.studien.us/content/images/2024/04/image-1.png" class="kg-image" alt="Running an LLM locally on your own machine" loading="lazy" width="1074" height="202" srcset="https://blog.studien.us/content/images/size/w600/2024/04/image-1.png 600w, https://blog.studien.us/content/images/size/w1000/2024/04/image-1.png 1000w, https://blog.studien.us/content/images/2024/04/image-1.png 1074w" sizes="(min-width: 720px) 720px"></figure><p>Regardless of your thoughts on this subject, I think we can all agree that we can handle ourselves and understand what AI is capable of.<br><br>After a quick scan of the Ollama website, I found there is an uncensored version of the llama2 model, and you can download and run it using:</p><p><code>ollama run llama2-uncensored</code></p>
<p>Let&apos;s see how it responds to the same query:</p><figure class="kg-card kg-image-card"><img src="https://blog.studien.us/content/images/2024/04/image-2.png" class="kg-image" alt="Running an LLM locally on your own machine" loading="lazy" width="1048" height="288" srcset="https://blog.studien.us/content/images/size/w600/2024/04/image-2.png 600w, https://blog.studien.us/content/images/size/w1000/2024/04/image-2.png 1000w, https://blog.studien.us/content/images/2024/04/image-2.png 1048w" sizes="(min-width: 720px) 720px"></figure><p>Just like that, you&apos;re running an LLM on your local computer. You don&apos;t even need to be connected to the internet for these models to work. Perfect for weathering a dystopian future of restricted LLMs.</p><p>Ollama also exposes a REST API, so your programmatic use cases are covered too! The world is your oyster!</p><pre><code>curl http://localhost:11434/api/generate -d &apos;{
  &quot;model&quot;: &quot;llama2&quot;,
  &quot;prompt&quot;: &quot;Why is the sky blue?&quot;
}&apos;
</code></pre>
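<p>One thing to know: by default that endpoint streams the answer back as a series of JSON lines. If you&apos;d rather receive a single JSON object, which is easier to handle in scripts, the API accepts a stream flag. A sketch against the same local endpoint:</p><pre><code>curl http://localhost:11434/api/generate -d &apos;{
  &quot;model&quot;: &quot;llama2&quot;,
  &quot;prompt&quot;: &quot;Why is the sky blue?&quot;,
  &quot;stream&quot;: false
}&apos;
</code></pre><p>The generated text comes back in the <code>response</code> field of the returned JSON.</p>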
<p>So definitely go and browse Ollama&apos;s website to see what models are available. If your computer is powerful enough, I suggest checking out a bigger model like dolphin-mixtral.<br><br></p>]]></content:encoded></item><item><title><![CDATA[Setting up a blog]]></title><description><![CDATA[<p>I&apos;ve wanted to create a blog for many years now. Originally I wanted to build this up from scratch, but I&apos;ve come to realize that I&apos;ve got limited time in this world. Therefore, I&apos;ve used Ghost which is an open source publication</p>]]></description><link>https://blog.studien.us/setting-up-a-blog/</link><guid isPermaLink="false">65e517c5b8c7c80001ccc3d8</guid><dc:creator><![CDATA[Cameron Anderson]]></dc:creator><pubDate>Mon, 04 Mar 2024 01:17:19 GMT</pubDate><media:content url="https://blog.studien.us/content/images/2024/03/ghost-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.studien.us/content/images/2024/03/ghost-1.png" alt="Setting up a blog"><p>I&apos;ve wanted to create a blog for many years now. Originally I wanted to build this up from scratch, but I&apos;ve come to realize that I&apos;ve got limited time in this world. Therefore, I&apos;ve used Ghost, which is open source publishing software. It&apos;s like WordPress, but with less annoying stigma around it. Ghost provides the technical stack, someone provides the theme, and I provide the content. <br><br>To preface this, I hate long blog posts. It takes a truly disciplined mind to read every word of those long blog posts. You&apos;re probably here to take inspiration and find an easy tutorial to follow along with. That&apos;s what I&apos;ll focus on making.</p><p>So if you want to self-host your own nice blog like this, it&apos;s super simple with Ghost. I have a server that I run in my own house. For all intents and purposes, you can think of this server as a glorified Docker container runner. 
I use Docker orchestration software like Portainer to give me a UI to work with, but you can do all of this from your own house too.</p><p>So in this post, we&apos;ll cover setting up DNS, a reverse proxy with SSL termination, and ultimately the actual Ghost and associated database setup. We will take care of all of this with Docker Compose and a few quick config files. </p><h2 id="dns">DNS</h2><hr><p>So let&apos;s start with DNS. I like to separate the services on my server with different subdomains. If you don&apos;t know what a subdomain is, keep reading. <br><br>Do you see the URL at the top of your browser right now? It probably says something like https://<strong>blog</strong>.studien.us. The domain name that I bought is studien.us, and <strong>blog </strong>is the subdomain. I create a DNS record called a CNAME to connect that subdomain up with my A record. The A record connects studien.us with an IP address, and that IP address is where my server sits on this big internet. <br><br>So yeah, this is a small subsection of my DNS records for my studien.us domain. 
I won&apos;t bother hiding the other stuff because it&apos;s not really a secret, and anyone with an ounce of technical knowledge can find this themselves.</p><figure class="kg-card kg-image-card"><img src="https://blog.studien.us/content/images/2024/03/Screenshot-2024-03-03-at-7.51.10-PM.png" class="kg-image" alt="Setting up a blog" loading="lazy" width="1318" height="432" srcset="https://blog.studien.us/content/images/size/w600/2024/03/Screenshot-2024-03-03-at-7.51.10-PM.png 600w, https://blog.studien.us/content/images/size/w1000/2024/03/Screenshot-2024-03-03-at-7.51.10-PM.png 1000w, https://blog.studien.us/content/images/2024/03/Screenshot-2024-03-03-at-7.51.10-PM.png 1318w" sizes="(min-width: 720px) 720px"></figure><p></p><h2 id="reverse-proxy-with-ssl-termination">Reverse Proxy with SSL termination</h2><hr><p>Caddy has been my reverse proxy of choice for a long time now, and it&apos;s been super easy to use and configure. I found it easier to pick up for my use case than the other reverse proxy solutions out there. You&apos;re welcome to use another reverse proxy though.</p><p>So let&apos;s create a file called docker-compose.yml:</p><pre><code class="language-yaml">version: &quot;3.7&quot;
services:
  caddy:
    image: caddy:latest
    container_name: caddy
    ports:
      - &quot;80:80&quot;
      - &quot;443:443&quot;
      - &quot;443:443/udp&quot;
    volumes:
      - &lt;YOUR_PATH_HERE&gt;/Caddyfile:/etc/caddy/Caddyfile
      - &lt;YOUR_PATH_HERE&gt;/data:/data
      - &lt;YOUR_PATH_HERE&gt;/config:/config</code></pre><p>Change &lt;YOUR_PATH_HERE&gt; to a directory on your local machine.</p><p>Alright great, this is a simple Docker Compose file that does three things: </p><ol>
<li>Pulls the Caddy Docker image</li>
<li>Maps ports from your machine to the inside of the Docker container</li>
<li>Mounts directories from your machine to the inside of the Docker container</li>
</ol>
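<p>Before going further, it&apos;s worth sanity-checking the file. <code>docker compose config</code> parses your compose file and prints the fully resolved configuration, failing loudly on any syntax mistakes:</p><pre><code># run from the directory containing docker-compose.yml
docker compose config</code></pre>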
<p>Go to the path you set above and create the Caddyfile. The other files will be created automatically. Once you&apos;ve created the Caddyfile, add the following:</p><pre><code>blog.studien.us {
        reverse_proxy &lt;YOUR_MACHINE_IP&gt;:3030
        log {
                output stdout
                format console
        }
}</code></pre><p>Change &lt;YOUR_MACHINE_IP&gt; to your machine&apos;s internal IP address. Also change the domain name to yours, so that it doesn&apos;t say studien.us. In the next step, we&apos;ll put the Ghost software on port 3030.</p><p></p><p>What&apos;s beautiful about this is that Caddy automatically requests certificates from a service called Let&apos;s Encrypt. Let&apos;s Encrypt checks your DNS records and compares them against your machine&apos;s external IP address. If everything works out, you&apos;ll get an SSL certificate for your blog subdomain. That means http<strong>s</strong> will work, and you&apos;ll get SSL termination at the reverse proxy.</p><p></p><h2 id="ghost">Ghost</h2><hr><p>The Ghost setup is actually stupid simple. Open the docker-compose.yml file that you created for Caddy and append the following to it, indented so that ghost and ghostdb sit under the existing services: key.</p><pre><code class="language-yaml">  ghost:
    image: ghost:5.80
    restart: always
    ports:
      - 3030:2368
    environment:
      database__client: mysql
      database__connection__host: ghostdb
      database__connection__user: root
      database__connection__password: &lt;DB_PASSWORD_FROM_BELOW&gt;
      database__connection__database: ghost
      url: https://blog.studien.us
    volumes:
      - /mnt/ghost/app:/var/lib/ghost/content
  ghostdb:
    image: mysql:8.0
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: &lt;CREATE_A_DB_PASSWORD_HERE&gt;
    volumes:
      - /mnt/ghost/db:/var/lib/mysql</code></pre><p>Replace both password placeholders with the <strong>same</strong> password (<code>database__connection__password</code> must match <code>MYSQL_ROOT_PASSWORD</code>). Also change the url to match yours.</p><p>Finally, you can simply run <code>docker compose up -d</code> against that file to bring the blog up. In a minute or two, you can navigate to the url you set above. Hopefully, you&apos;ll see the blog now. <br></p><p>To create an admin account, you&apos;ll need to go to the /ghost route. That makes my admin URL https://blog.studien.us/ghost. In here, you can customize the UI, create posts, and much, much more. </p><p>I&apos;ll leave the rest up to you!</p>]]></content:encoded></item><item><title><![CDATA[Coming soon]]></title><description><![CDATA[<p>This is Blue Sky Projects, a brand new site by Cameron Anderson that&apos;s just getting started. Things will be up and running here shortly, but you can <a href="#/portal/">subscribe</a> in the meantime if you&apos;d like to stay up to date and receive emails when new content is</p>]]></description><link>https://blog.studien.us/coming-soon/</link><guid isPermaLink="false">65e513f1b8c7c80001ccc1f2</guid><category><![CDATA[News]]></category><dc:creator><![CDATA[Cameron Anderson]]></dc:creator><pubDate>Mon, 04 Mar 2024 00:21:05 GMT</pubDate><media:content url="https://static.ghost.org/v4.0.0/images/feature-image.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://static.ghost.org/v4.0.0/images/feature-image.jpg" alt="Coming soon"><p>This is Blue Sky Projects, a brand new site by Cameron Anderson that&apos;s just getting started. Things will be up and running here shortly, but you can <a href="#/portal/">subscribe</a> in the meantime if you&apos;d like to stay up to date and receive emails when new content is published!</p>]]></content:encoded></item></channel></rss>