<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[geekmonkey]]></title><description><![CDATA[Programming · Research · Technology]]></description><link>https://geekmonkey.org/</link><image><url>https://geekmonkey.org/favicon.png</url><title>geekmonkey</title><link>https://geekmonkey.org/</link></image><generator>Ghost 5.79</generator><lastBuildDate>Fri, 23 Feb 2024 03:38:21 GMT</lastBuildDate><atom:link href="https://geekmonkey.org/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Detecting unused database columns using Ecto schemas]]></title><description><![CDATA[<p>When building large database-backed Elixir applications using Ecto, it is inevitable for your database schema to evolve over time. As your database schema changes, it&apos;s possible for there to be discrepancies between the tables in the database and your local schema definitions. 
Tracking inconsistencies between the schema defined</p>]]></description><link>https://geekmonkey.org/detecting-unused-database-columns-using-ecto-schemas/</link><guid isPermaLink="false">644b7974dc7f82003dcb81c8</guid><category><![CDATA[Elixir]]></category><category><![CDATA[PostgreSQL]]></category><category><![CDATA[Databases]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Fri, 28 Apr 2023 14:01:28 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1631624217902-d14c634ab17c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDl8fGRhdGFiYXNlfGVufDB8fHx8MTY4MjYyOTgzOQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1631624217902-d14c634ab17c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDl8fGRhdGFiYXNlfGVufDB8fHx8MTY4MjYyOTgzOQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Detecting unused database columns using Ecto schemas"><p>When building large database-backed Elixir applications using Ecto, it is inevitable for your database schema to evolve over time. As your database schema changes, it&apos;s possible for there to be discrepancies between the tables in the database and your local schema definitions. Tracking inconsistencies between the schema defined in your application&apos;s modules and what&apos;s actually in the database can be tedious.</p><p>Thankfully, there&apos;s a way we can look for inconsistencies programmatically.</p><h3 id="finding-all-schema-modules">Finding all schema modules</h3><p>A normal Ecto schema will look something like the following:</p><pre><code class="language-elixir">defmodule User do
  use Ecto.Schema

  schema &quot;users&quot; do
    field :name, :string
    field :age, :integer, default: 0
    field :password, :string, redact: true
    has_many :posts, Post
  end
end</code></pre><p><code>schema</code> is actually a macro that, when expanded, declares a number of functions. We can use the presence of the <code>__schema__</code> function to filter our application&apos;s modules to grab only those that have a schema definition.</p><p>To get all modules defined in your application, we can do:</p><figure class="kg-card kg-code-card"><pre><code class="language-elixir">{:ok, modules} = :application.get_key(:your_app, :modules)</code></pre><figcaption>Obtain a list of all modules in our OTP application</figcaption></figure><p>Now we want to filter out modules without the <code>__schema__</code> function:</p><figure class="kg-card kg-code-card"><pre><code class="language-elixir">schema_modules =
  modules 
  |&gt; Enum.filter(&amp;({:__schema__, 1} in &amp;1.__info__(:functions))) </code></pre><figcaption>Filter out modules that do not have a schema definition</figcaption></figure><p>This will give us all modules where the <code>__schema__</code> function is present. This will include modules that define embedded schemas, however. Since embedded schemas don&apos;t directly map to a database table, we&apos;ll want to exclude those modules. Taking a look at the implementation of the <code>schema</code> and <code>embedded_schema</code> macros reveals that only <code>schema</code> definitions add an additional <code>__meta__</code> field to the module struct. </p><figure class="kg-card kg-code-card"><pre><code class="language-elixir">  |&gt; Enum.filter(&amp;(:__meta__ in Map.keys(&amp;1.__schema__(:loaded))))</code></pre><figcaption>Further filter out embedded schemas</figcaption></figure><h3 id="comparing-the-schemas-with-the-database">Comparing the schemas with the database</h3><p>We can now iterate over all schema modules. Let&apos;s define a function for this:</p><pre><code class="language-elixir">defmodule SchemaChecker do
  def schema_modules do
    {:ok, modules} = :application.get_key(:your_app, :modules)
    
    modules 
    |&gt; Enum.filter(&amp;({:__schema__, 1} in &amp;1.__info__(:functions))) 
    |&gt; Enum.filter(&amp;(:__meta__ in Map.keys(&amp;1.__schema__(:loaded))))
  end
  
  def check_module(module) do
    [...]
  end
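
  # Hypothetical addition (a sketch, not part of the original post): audit
  # every schema module at once by pairing each module with the result of
  # check_module/1 as defined in the following sections.
  def check_all do
    for module &lt;- schema_modules(), do: {module, check_module(module)}
  end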
end</code></pre><p>To obtain the list of fields on each schema we can continue using the <code>__schema__</code> function. </p><pre><code class="language-elixir">fields = module.__schema__(:fields)</code></pre><p>This is almost perfect, but fields in our schema can have different names than the database column. For this we need to get the <code>:field_source</code>, so we change the above to:</p><pre><code class="language-elixir">fields = 
  module.__schema__(:fields)
  |&gt; Enum.map(&amp;module.__schema__(:field_source, &amp;1))
  |&gt; Enum.map(&amp;Atom.to_string/1)</code></pre><p>You&apos;ll note that we also convert the atoms to strings. This is to more easily compare the lists later on.</p><p>We can now run a simple dummy query against our database with a <code>LIMIT 0</code> that will return an empty result set but list out all available columns. By computing the difference between our list of fields and the database columns we can determine which columns are no longer referenced by any schema:</p><pre><code class="language-elixir">table_name = module.__schema__(:source)

{:ok, %{columns: columns}} = DB.Repo.query(&quot;SELECT * FROM #{table_name} LIMIT 0&quot;)

columns -- fields</code></pre><p>You may have to play around with the result set a bit in case you are not using PostgreSQL. </p><h3 id="summary">Summary</h3><p>This post is the result of several database cleanup efforts in my dayjob. I was positively surprised how easy it was to leverage Ecto&apos;s powerful API and well-written documentation to get this task done. </p><p>You can find the final version at: <a href="https://gist.github.com/halfdan/6853a8cc8994eca4c9311ecad04a0eb0?ref=geekmonkey.org">https://gist.github.com/halfdan/6853a8cc8994eca4c9311ecad04a0eb0</a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Names should be descriptive, not cute]]></title><description><![CDATA[<blockquote>&#x2018;There are only two hard things in Computer Science: cache invalidation and naming things.&#x2019;<br>- Phil Karlton</blockquote><p>Mixmaster, Broadside, Windcharger, Ransack, Ramjet, Wheeljack, Skyfire, Slingshot and Wideload. Those were project names I had to deal with in my previous role. 
Some of those projects were backend services, some</p>]]></description><link>https://geekmonkey.org/names-should-be-descriptive-not-cute/</link><guid isPermaLink="false">63be7777683ed4003d187674</guid><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Wed, 11 Jan 2023 15:30:39 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1598348341635-33a3f4205d32?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDd8fHRyYW5zZm9ybWVyc3xlbnwwfHx8fDE2NzM0MzA2MjA&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<blockquote>&#x2018;There are only two hard things in Computer Science: cache invalidation and naming things.&#x2019;<br>- Phil Karlton</blockquote><img src="https://images.unsplash.com/photo-1598348341635-33a3f4205d32?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDd8fHRyYW5zZm9ybWVyc3xlbnwwfHx8fDE2NzM0MzA2MjA&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Names should be descriptive, not cute"><p>Mixmaster, Broadside, Windcharger, Ransack, Ramjet, Wheeljack, Skyfire, Slingshot and Wideload. Those were project names I had to deal with in my previous role. Some of those projects were backend services, some libraries and some were for a frontend. 
</p><blockquote>&quot;Why is there a <strong>RansackRunner</strong> in <strong>Windcharger </strong>that actually runs <strong>Ramjet</strong>?&quot;<br>- Some engineer</blockquote><p>Unlike the author of &quot;<a href="https://ntietz.com/blog/name-your-projects-cutesy-things/?ref=geekmonkey.org">Names should be cute, not descriptive</a>&quot;, I believe the opposite to be true.</p><p>The main arguments from the linked article can be summarized as:</p><ul><li>Projects change and names may no longer fully reflect the responsibility of the project</li><li>Descriptive names are hard to say and remember and are not &quot;fun&quot;</li></ul><p>Names serve a major purpose: they help everyone involved in a project understand what is being talked about. In a company with many projects it can quickly get confusing when you have tens of &quot;cute&quot; project names thrown around on a daily basis. Almost every month I would have to spend time in meetings explaining the difference between two similar-sounding &quot;cute&quot; project names and reminding everyone what those projects actually did. Additionally, while the engineers would often refer to the projects by their cute names, the PMs would not bother remembering the names and instead talk about &quot;the backend&quot; and &quot;the frontend&quot;, which caused significant friction and confusion on both sides.</p><p>The moment we switched to more descriptive names, this changed. 
By establishing Domain-Driven Design&apos;s &quot;<a href="https://thedomaindrivendesign.io/developing-the-ubiquitous-language/?ref=geekmonkey.org">Ubiquitous Language</a>&quot; across our different functions, we were able to find names that were easy to remember, specific enough for our business context, but generic enough not to limit the projects&apos; growth.</p><p>Not only was communication easier with the descriptive project names, it also helped onboarding new engineers and PMs, who no longer had to spend their first few weeks on the job producing their own version of an up-to-date glossary mapping codenames to &quot;what does this thing do&quot;.</p><p>Projects would still evolve, and the names would eventually no longer fully describe the responsibility of the underlying service or library. When this happened there were three options:</p><ul><li>alter the name to once again fully capture the responsibility of our project</li><li>refactor the project and split it up</li><li>keep the name as it was still &quot;good enough&quot;</li></ul><p>Renaming projects is often harder than finding the name in the first place because project names percolate into many places such as environment variables, class names and functions. For this reason we never actually renamed a project; we simply phased it out over time and replaced it with a project with a more descriptive name. </p><p>It&apos;s rare, and speaks of bad decision making, when a project suddenly does a 180 and somehow no longer matches any of the original intent embodied by its name. When you hide behind cute project names it&apos;s easy to let something like this slip, but when you have a descriptive name it allows you to catch problems like this early on.</p><p>Sometimes, though, the name is simply still accurate enough, and an argument could be made that feature X is close enough in context for it to end up in the specific project. 
This is the case when a feature is merely extending the original project and doesn&apos;t introduce something completely new.</p><p>There may be a middle ground to all of this. If you are a company with a massive monolithic application, then the name will probably not matter too much. It will be fine to give it a &quot;cutesy&quot; name. The moment you expand to multiple codebases or even microservices, I strongly suggest sprinkling some descriptiveness into the name. It&apos;s entirely valid to combine cute and descriptive and produce a name like &quot;godzilla-auth-service&quot;. </p><!--kg-card-begin: html--><s>The world is boring enough as is. Let&apos;s add more whimsy and cuteness through our service and project names.</s><!--kg-card-end: html--><p>The world is pretty ok. Let&apos;s be kind to each other, use descriptive names, and not make work harder than it needs to be.</p>]]></content:encoded></item><item><title><![CDATA[Setting up PyTorch on Mac M1 GPUs (Apple Metal / MPS)]]></title><description><![CDATA[<p>I&apos;m currently taking the Deep Learning nano degree offered by Udacity to deepen my knowledge of modern Machine Learning and Deep Learning.</p><p>The course walks through the basics of ML and introduces PyTorch and will require knowledge of it in later stages. 
</p><p>This post will be short and</p>]]></description><link>https://geekmonkey.org/setting-up-jupyter-lab-with-pytorch-on-a-mac-with-gpu/</link><guid isPermaLink="false">6376463f4fbb84003d5d6fdc</guid><category><![CDATA[Python]]></category><category><![CDATA[Machine Learning]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Fri, 18 Nov 2022 08:59:02 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1591453089816-0fbb971b454c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDh8fG1hY2hpbmUlMjBsZWFybmluZ3xlbnwwfHx8fDE2Njg3NjE4NzQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1591453089816-0fbb971b454c?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDh8fG1hY2hpbmUlMjBsZWFybmluZ3xlbnwwfHx8fDE2Njg3NjE4NzQ&amp;ixlib=rb-4.0.3&amp;q=80&amp;w=2000" alt="Setting up PyTorch on Mac M1 GPUs (Apple Metal / MPS)"><p>I&apos;m currently taking the Deep Learning nano degree offered by Udacity to deepen my knowledge of modern Machine Learning and Deep Learning.</p><p>The course walks through the basics of ML and introduces PyTorch and will require knowledge of it in later stages. </p><p>This post will be short and sweet and simply walk you through the steps you need to take to make use of PyTorch using a standard Python installation and have a GPU enabled. This is all possible with PyTorch nightly which introduces a new <a href="https://pytorch.org/docs/master/notes/mps.html?ref=geekmonkey.org">MPS backend</a>:</p><blockquote>The new MPS backend extends the PyTorch ecosystem and provides existing scripts capabilities to setup and run operations on GPU.</blockquote><p>This was previously announced on the <a href="https://pytorch.org/blog/introducing-accelerated-pytorch-training-on-mac/?ref=geekmonkey.org">PyTorch Blog</a> and is a good read in and by itself. 
The expected improvement over regular CPU training/evaluation looks spectacular.</p><h2 id="installing-python">Installing Python</h2><p>I use <a href="https://asdf-vm.com/?ref=geekmonkey.org">asdf</a> to manage the languages I work with. It can be installed with <code>brew install asdf</code>. This adds a small shell script that you can use to install plugins and individual versions. We&apos;ll start by adding the Python plugin:</p><pre><code class="language-sh">asdf plugin add python</code></pre><p>We can then list all available versions using:</p><pre><code class="language-sh">asdf list all python</code></pre><p>To install a version and make it the global default we simply run:</p><pre><code class="language-sh">asdf install python 3.10.7
asdf global python 3.10.7</code></pre><p>Refer to the <a href="https://asdf-vm.com/guide/getting-started.html?ref=geekmonkey.org#_1-install-dependencies">asdf documentation</a> to learn more about all its features and supported plugins.</p><p>Note that at the time of this writing PyTorch is <a href="https://github.com/pytorch/pytorch/issues/86566?ref=geekmonkey.org">not compatible</a> with Python 3.11 - we&apos;ll use the latest 3.10.x release instead.</p><h2 id="setting-up-our-work-environment">Setting up our work environment</h2><p>For the first project I want to set up a virtual environment with all the core dependencies and a Jupyter Notebook installation for easy exploration. We&apos;ll start by creating a virtual environment and activating it:</p><figure class="kg-card kg-code-card"><pre><code class="language-sh">python -m venv .venv
. .venv/bin/activate</code></pre><figcaption>Create virtual environment and activate it</figcaption></figure><p>From here we can use the <a href="https://pytorch.org/get-started/locally/?ref=geekmonkey.org">configuration tool</a> on the PyTorch website to get the installation command for the nightly version.</p><figure class="kg-card kg-image-card"><img src="https://geekmonkey.org/content/images/2022/11/image.png" class="kg-image" alt="Setting up PyTorch on Mac M1 GPUs (Apple Metal / MPS)" loading="lazy" width="800" height="298" srcset="https://geekmonkey.org/content/images/size/w600/2022/11/image.png 600w, https://geekmonkey.org/content/images/2022/11/image.png 800w" sizes="(min-width: 720px) 720px"></figure><pre><code class="language-sh">pip3 install --pre torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/nightly/cpu</code></pre><p>With this we should have PyTorch installed and ready to go. We can validate our installation in the Python REPL:</p><figure class="kg-card kg-code-card"><pre><code class="language-sh">$ python
Python 3.10.7 (main, Nov 17 2022, 15:26:39) [Clang 14.0.0 (clang-1400.0.29.202)] on darwin
Type &quot;help&quot;, &quot;copyright&quot;, &quot;credits&quot; or &quot;license&quot; for more information.
&gt;&gt;&gt; import torch
&gt;&gt;&gt; torch.backends.mps.is_available()
True
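&gt;&gt;&gt; # Assuming MPS is available (True above), a tensor can be created
&gt;&gt;&gt; # directly on the GPU by requesting the &quot;mps&quot; device:
&gt;&gt;&gt; x = torch.ones(2, device=&quot;mps&quot;)
&gt;&gt;&gt; x.device
device(type=&apos;mps&apos;)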
</code></pre><figcaption>The MPS backend is available</figcaption></figure><p>At this point I was able to follow the PyTorch tutorial and leverage my GPU. In places where the tutorial references a CUDA device, you can simply use the <code>mps</code> device.</p><p>I&apos;m excited to have a powerful GPU readily available on my machine without the need to build a separate rig with CUDA cores. Given that I&apos;m new to this field I&apos;m not sure how far the Apple GPUs will take me, but for now they can support me in my learning journey.</p>]]></content:encoded></item><item><title><![CDATA[Rethink your git workflow with git-worktree]]></title><description><![CDATA[Git is a powerful tool used by almost everyone in the the tech industry. Worktrees are an often overlooked feature that is presented in this article.]]></description><link>https://geekmonkey.org/rethink-your-git-workflow-with-git-worktree/</link><guid isPermaLink="false">61fa56c6215996003bc915a7</guid><category><![CDATA[Git]]></category><category><![CDATA[Neovim]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Fri, 18 Feb 2022 17:02:34 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1556075798-4825dfaaf498?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDN8fGdpdHxlbnwwfHx8fDE2NDUzMDA5OTM&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1556075798-4825dfaaf498?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDN8fGdpdHxlbnwwfHx8fDE2NDUzMDA5OTM&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Rethink your git workflow with git-worktree"><p>Git is an immensely powerful tool with a vast set of features. Worktrees are one feature that was unknown to me until recently, and boy do I wish I had discovered it sooner. 
</p><p>When working in a team, especially when working on larger codebases, it will be common to switch branches frequently. I usually <code>git stash</code> any uncommitted changes or simply <code>git commit -m &quot;wip&quot;</code> them. This can quickly get out of control when I switch multiple times, and juggling numerous stashes becomes complicated. </p><p>Git worktrees are a fantastic way to improve this workflow. They&apos;re not a new feature by any means; they&apos;ve been around for years.</p><h2 id="what-is-git-worktree">What is git worktree?</h2><blockquote>&quot;Manage multiple working trees attached to the same repository.&quot;</blockquote><p>When you <code>git switch &lt;branch&gt;</code> to another branch, your working directory will be changed in place. The changes from the other branch are no longer directly accessible. This is precisely why you have to stash or commit any changes before switching. With a normal <code>git clone</code> you get exactly one working directory.</p><p>With git worktree you can have multiple directories for the same repository. Each worktree will point to a different commit/branch. This opens up the ability to quickly switch between branches without interrupting your current work. </p><p>I personally started using worktrees to check out code from other people&apos;s PRs for review or development or to simply switch to a different set of work when I get stuck on something.</p><h2 id="usage">Usage</h2><p>The workflow shown here starts from a bare clone of your repository (worktrees also work in a regular clone, but a bare clone keeps every checkout side by side). This will look a bit weird if you haven&apos;t seen it before. 
Instead of a <code>.git</code> directory alongside your code, you will have a directory with all the contents that normally go into <code>.git</code>.</p><pre><code>git clone --bare https://github.com/golang/go.git</code></pre><p>With this on your disk, you can now create a worktree by typing:</p><pre><code>git worktree add master</code></pre><p>This will create a new folder inside your bare clone with the name <code>master</code> that will contain all the code from the master branch. From there, you can interact with your code and git just like you usually would.</p><p>At this point, we&apos;ve just reached feature parity. The usefulness of this feature becomes apparent the moment you create a second worktree, e.g., to work on a separate ticket:</p><pre><code>git worktree add feat-123</code></pre><p>You now have two separate versions of your code side by side that you can work on without having to stash or commit code when switching between branches.</p><p>We can just as easily remove worktrees we no longer need with <code>git worktree remove feat-123</code>.</p><h2 id="editor-support">Editor Support</h2><p>Despite their usefulness, I haven&apos;t found great support for worktrees in code editors. There&apos;s an <a href="https://github.com/microsoft/vscode/issues/68038?ref=geekmonkey.org">open issue</a> for VSCode requesting support for worktrees, but it seems it&apos;s lacking community support to get something shipped.</p><p>There is, however, a <a href="https://github.com/ThePrimeagen/git-worktree.nvim?ref=geekmonkey.org">plugin</a> for neovim, written by popular twitch.tv streamer <a href="https://www.twitch.tv/theprimeagen?ref=geekmonkey.org">ThePrimeagen</a>. This plugin allows rapidly switching between and managing worktrees from within neovim. 
He even explains the concept and his plugin in a video on YouTube:</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/2uEqYw-N8uE?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><h2 id="recap">Recap</h2><p>Git worktree is a hidden gem that is unknown to many. It can drastically simplify development workflows by allowing users to quickly switch between branches without committing their work.</p><p>I&apos;m almost mad I hadn&apos;t discovered this feature of git sooner. The number of times I have lost some of my work in a messy git stash could have easily been zero had I used worktrees from the get-go.</p>]]></content:encoded></item><item><title><![CDATA[Learning to type Dvorak]]></title><description><![CDATA[<p>Earlier this year, and for the first time in my professional life, I experienced pain caused by repetitive stress injury (RSI). An incorrect desk setup or keyboard can have many detrimental effects on your ability to sustain long hours in front of a computer. 
With RSI not wanting to go</p>]]></description><link>https://geekmonkey.org/learning-to-type-dvorak/</link><guid isPermaLink="false">61af6948fb105e0048ce0fda</guid><category><![CDATA[Keyboards]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Mon, 03 Jan 2022 15:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1598428452014-264f8ff1b174?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fHNwbGl0JTIwa2V5Ym9hcmR8ZW58MHx8fHwxNjQ0MjY5NTU0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1598428452014-264f8ff1b174?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDF8fHNwbGl0JTIwa2V5Ym9hcmR8ZW58MHx8fHwxNjQ0MjY5NTU0&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Learning to type Dvorak"><p>Earlier this year, and for the first time in my professional life, I experienced pain caused by repetitive stress injury (RSI). An incorrect desk setup or keyboard can have many detrimental effects on your ability to sustain long hours in front of a computer. With RSI not wanting to go away, I was on the market for a new, more ergonomic keyboard that put less stress on my forearms. </p><p>As I was traveling shortly after experiencing RSI-related issues, I had time to research and explore different keyboard options. For the last few years, I&apos;ve been typing away on a pretty standard flat Apple magic wireless keyboard which I&apos;ve enjoyed. The keyboard is super flat and portable, easy to stow away in a bag and its battery lasts forever. Combined with the magic trackpad, it&apos;s been a good travel companion. </p><p>Initially, I bought a Ducky One 2 Mini RGB. Not at all an ergonomic upgrade, but I thought it would be enough of a change to relieve me of the pain. After just a few days of typing on it, I could already tell this wasn&apos;t working for me. 
I also struggled to get used to not having arrow keys available and ended up not liking the high keys as much as I thought.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/12/duckyone2mini.jpg" class="kg-image" alt="Learning to type Dvorak" loading="lazy" width="1974" height="684" srcset="https://geekmonkey.org/content/images/size/w600/2021/12/duckyone2mini.jpg 600w, https://geekmonkey.org/content/images/size/w1000/2021/12/duckyone2mini.jpg 1000w, https://geekmonkey.org/content/images/size/w1600/2021/12/duckyone2mini.jpg 1600w, https://geekmonkey.org/content/images/2021/12/duckyone2mini.jpg 1974w" sizes="(min-width: 720px) 720px"><figcaption>Ducky One 2 Mini RGB</figcaption></figure><h3 id="qwertzqwerty">QWERTZ/QWERTY</h3><p>The first computer we owned in 1998 came with a standard German keyboard layout. Once I got into programming, I very quickly converted to the English QWERTY layout as the keys we use as programmers are much more accessible with this layout. That was probably around 2001 or 2002 when I got my own computer in my room. </p><p>I never correctly learned touch typing; I was an auto-didact. I mainly use my index, middle, and ring fingers for typing. My thumbs both rest on the space bar, and I&apos;ll use them for keyboard shortcuts that include the Option or Command key. With shortcuts that use the Function or Control key, I&apos;ll switch to my left little finger, which I also use to press the Shift key. Further up the keyboard, I will use my ring finger to hit Tab or backspace. My right little finger will only hit enter. It&apos;s worth mentioning that my right Shift key goes completely unused.</p><p>With this weird, self-taught typing style, I manage to hit speeds of 110 WPM with an average somewhere in the low-00s with an accuracy around 98-99%. 
This seems to place me well above the average professional typist.</p><p>All that out of the way, I still feel like my typing isn&apos;t particularly efficient. My fingers jump all over the keyboard, and I have to make some awkward moves to nail specific keyboard shortcuts.</p><h3 id="dvorak-touch-typing">Dvorak &amp; Touch Typing</h3><p>This finally brings me to the Kinesis Advantage 2 QD. I first came across this keyboard in a review by <a href="https://martinfowler.com/articles/kinesis-advantage2.html?ref=geekmonkey.org">Martin Fowler</a>. After more research and a deep look into my pockets, I decided to give this keyboard a shot. After all, not being able to type because of wrist pain would eventually cost me more than this keyboard.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/12/kinesis-advantage2-qd.jpg" class="kg-image" alt="Learning to type Dvorak" loading="lazy" width="976" height="497" srcset="https://geekmonkey.org/content/images/size/w600/2021/12/kinesis-advantage2-qd.jpg 600w, https://geekmonkey.org/content/images/2021/12/kinesis-advantage2-qd.jpg 976w" sizes="(min-width: 720px) 720px"><figcaption>Kinesis Advantage 2 QD</figcaption></figure><p>I was always quite intrigued by weird-looking ergonomic keyboards. Sometimes you&apos;d find them split into two halves, sometimes they&apos;re propped up, and then there&apos;s the Advantage 2 with its strange hollow body.</p><p>Similarly, I always wanted to learn Dvorak. Years ago, I tried to learn it for a few weeks, but it simply never stuck with me. Since I would have to learn to touch type on the Advantage 2, I might as well learn Dvorak with it. </p><p>Luckily, Kinesis offers the QD model, which, as you can see if you look closely enough, comes with the keys labeled for both QWERTY and Dvorak. 
While I never really look down while typing, I figured this would be handy for the first few weeks when learning Dvorak.</p><p>Progress has admittedly been slow, and I had to switch back and forth between my new and my old keyboard to stay productive at work. I have been using keybr.com to practice typing Dvorak as it&apos;s too difficult for me to remain productive while adjusting to the new keyboard and keyboard layout. Below you can see my current progress after eight hours of practice (plus some additional time I just spent typing away on Slack with the new keyboard).</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/12/image-2.png" class="kg-image" alt="Learning to type Dvorak" loading="lazy" width="1268" height="546" srcset="https://geekmonkey.org/content/images/size/w600/2021/12/image-2.png 600w, https://geekmonkey.org/content/images/size/w1000/2021/12/image-2.png 1000w, https://geekmonkey.org/content/images/2021/12/image-2.png 1268w" sizes="(min-width: 720px) 720px"><figcaption>Typing speed after 8 hours of practicing with Dvorak on the Kinesis Advantage 2</figcaption></figure><p>There&apos;s visible progress, and I have just added the last remaining letters to the practice set. I&apos;m nowhere near my average typing speed, but I&apos;m also just using the keyboard for about an hour every day so far. </p><p>My pain is gone, however, so I&apos;m more than happy to have made the switch. The keyboard takes a few days to get used to. There&apos;s a lot of muscle memory to rewire when you&apos;ve used a regular keyboard for two decades, so I often found myself hitting backspace when I wanted to hit space. </p><p>It&apos;s fascinating to watch the brain make new connections while I&apos;m not at the keyboard. I&apos;d get frustrated after an hour of practice making more and more mistakes, but coming back the next day I can immediately type faster and make fewer errors while using new keys. 
</p><p>The keyboard forces you to touch type, and so far my fingers flow much more smoothly over the keyboard. Without touch typing, it sometimes feels as if my fingers were performing some kind of breakdance.</p><h3 id="real-programmer-dvorak">(Real) Programmer Dvorak</h3><p>Dvorak is optimized for English text. While my main communication language is English, as a programmer I frequently need special characters. This exact need has brought on several variations of Dvorak called &quot;<a href="https://www.kaufmann.no/roland/dvorak/?ref=geekmonkey.org">Programmer Dvorak</a>&quot; and &quot;<a href="https://github.com/ThePrimeagen/keyboards?ref=geekmonkey.org">Real Programmer&apos;s Dvorak</a>&quot;. Both aim to optimize the position of special characters. </p><p>I briefly tried to set up my Advantage 2 to emulate Real Programmer&apos;s Dvorak, but I found it difficult to properly remap all keys on the Advantage 2 to make it work. For now, I&apos;ve decided to stick to regular Dvorak and explore additional remaps once I&apos;m more used to the layout and keyboard. </p><h3 id="resources">Resources</h3><p>There are plenty of resources to use for learning to type with a new keyboard layout. The two sites that I can wholeheartedly recommend are both free and provide generated word sequences (or quotes). </p><ul><li><a href="https://monkeytype.com/?ref=geekmonkey.org">https://monkeytype.com/</a></li><li><a href="https://www.keybr.com/?ref=geekmonkey.org">https://www.keybr.com/</a></li></ul><p>keybr.com stands out as it starts you with a small set of letters and only lets you add more letters once you&apos;ve reached a certain WPM threshold. Other than that, the only thing required is time and practice. 
</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter, where at least once a week, I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div>]]></content:encoded></item><item><title><![CDATA[It's bingo time (AoC 2021 Day 4)]]></title><description><![CDATA[<p>Our underwater journey continues. It can get dark underwater, and today is no exception. The only thing we can see apparently is a giant squid that has attached itself to the outside of our submarine. </p><p>It&apos;s only logical that we decide to play Bingo with the squid. Hey,</p>]]></description><link>https://geekmonkey.org/aoc2021-day4/</link><guid isPermaLink="false">61bc8ea54a0b5400487c10ac</guid><category><![CDATA[Advent Of Code]]></category><category><![CDATA[Golang]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Mon, 20 Dec 2021 15:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1620856516969-6b6f1c1e780b?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGJpbmdvfGVufDB8fHx8MTYzOTgzMTQ5Nw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1620856516969-6b6f1c1e780b?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGJpbmdvfGVufDB8fHx8MTYzOTgzMTQ5Nw&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="It&apos;s bingo time (AoC 2021 Day 4)"><p>Our underwater journey continues. It can get dark underwater, and today is no exception. The only thing we can see apparently is a giant squid that has attached itself to the outside of our submarine. 
</p><p>It&apos;s only logical that we decide to play Bingo with the squid. Hey, maybe we should check the oxygen generator and co2 levels again; we seem to be making strange decisions. </p><h3 id="part-1">Part 1</h3><p>The rules of Bingo are simple: A unique number is drawn from a pool. If your board contains this number, you mark it off. The first player with five numbers marked in a horizontal or vertical direction wins the game.</p><p>Since the squid is excellent at multi-tasking, it will play many boards simultaneously.</p><figure class="kg-card kg-code-card"><pre><code>7,4,9,5,11,17,23,2,0,14,21,24,10,16,13,6,15,25,12,22,18,20,8,19,3,26,1

22 13 17 11  0
 8  2 23  4 24
21  9 14 16  7
 6 10  3 18  5
 1 12 20 15 19
 
 3 15  0  2 22
 9 18 13 17  5
19  8  7 25 23
20 11 10 24  4
14 21 16 12  6</code></pre><figcaption>Example input for the bingo game</figcaption></figure><p>I&apos;ll spare you the boring part of parsing the input numbers and bingo boards; that&apos;s quickly done with a few <code>strings.Fields</code>.</p><p>For the first part, we only need to find the board that wins first. We compute its score by taking the sum of all unmarked numbers and multiplying that with the number that was last drawn.</p><p>I chose to represent the board as a struct, containing the list of numbers, a list of booleans of the same length, and a boolean flag to indicate whether the board has won already.</p><pre><code class="language-go">type Board struct {
	numbers  []int
	matched  []bool
	hasBingo bool
}</code></pre><p>After constructing all the bingo boards, we can loop over the list of drawn numbers:</p><pre><code class="language-go">Loop:
for i := range numbers {
    for b := range boards {
        if boards[b].hasBingo {
            continue
        }
        boards[b].markDigit(numbers[i])

        if boards[b].isBingo() {
            score = boards[b].sumUnmarked() * numbers[i]
            break Loop
        }
    }
}
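The text notes that the implementation of <code>markDigit</code> is omitted for brevity. For readers who want a runnable picture, here is a minimal sketch of how it could look; the Board struct is repeated from above so the snippet is self-contained, and the linear scan is my own choice rather than necessarily the author's actual implementation.

```go
package main

import "fmt"

// Board mirrors the struct defined earlier in the article.
type Board struct {
	numbers  []int
	matched  []bool
	hasBingo bool
}

// markDigit scans the board and flags every cell holding num as matched.
func (b *Board) markDigit(num int) {
	for i := range b.numbers {
		if b.numbers[i] == num {
			b.matched[i] = true
		}
	}
}

func main() {
	b := Board{numbers: []int{22, 13, 17, 11, 0}, matched: make([]bool, 5)}
	b.markDigit(17)
	fmt.Println(b.matched) // [false false true false false]
}
```

With only 25 cells per board, a linear scan is plenty fast; a map from number to cell index would avoid the scan if boards were larger.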
</code></pre><p>For each new number, we call <code>markDigit</code> (implementation omitted for brevity) on each board. The method checks whether the number is on the board and marks it as <code>matched</code>. Once a board has a bingo, we compute its score and break out of the loop.</p><pre><code class="language-go">func (b *Board) isBingo() bool {
	// Check rows
	isBingo := true
	for i := 0; i &lt; len(b.matched); i += 5 {
		isBingo = true
		for j := 0; j &lt; 5; j++ {
			isBingo = isBingo &amp;&amp; b.matched[i+j]
		}
		if isBingo {
			return true
		}
	}

	// Check columns
	for i := 0; i &lt; 5; i++ {
		isBingo = true
		for j := 0; j &lt; 5; j++ {
			isBingo = isBingo &amp;&amp; b.matched[i+j*5]
		}
		if isBingo {
			return true
		}
	}
	return false
}</code></pre><p><code>sumUnmarked</code> iterates over the list of all numbers to find those that are unmarked. &#xA0;</p><pre><code class="language-go">func (b *Board) sumUnmarked() (sum int) {
	for i := range b.numbers {
		if !b.matched[i] {
			sum += b.numbers[i]
		}
	}
	return
}</code></pre><p>Let the games begin!</p><h3 id="part-2">Part 2</h3><p>In part two, we worry that it might be wiser to let the squid win; who knows what it&apos;ll do to our precious submarine if it loses. We can easily modify our loop from part one and keep track of the board that wins last. </p><pre><code class="language-go">for i := range numbers {
    for b := range boards {
        if boards[b].hasBingo {
            continue
        }
        boards[b].markDigit(numbers[i])

        if boards[b].isBingo() {
            if firstScore == 0 {
                firstScore = boards[b].sumUnmarked() * numbers[i]
            }
            lastScore = boards[b].sumUnmarked() * numbers[i]
            boards[b].hasBingo = true
        }
    }
}
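As a taste of the bitmask idea mentioned in the wrap-up, here is a sketch under my own assumptions (the marked cells stored as the low 25 bits of a uint32 in row-major order); this is not code from the original solution, but it shows how each winning line collapses into a single mask comparison:

```go
package main

import "fmt"

// bingoMasks returns one bitmask per winning line on a 5x5 board:
// bit i corresponds to cell i in row-major order.
func bingoMasks() []uint32 {
	masks := make([]uint32, 0, 10)
	for i := 0; i < 5; i++ {
		var row, col uint32
		for j := 0; j < 5; j++ {
			row |= 1 << (i*5 + j) // five consecutive bits form a row
			col |= 1 << (j*5 + i) // every fifth bit forms a column
		}
		masks = append(masks, row, col)
	}
	return masks
}

// isBingoMask reports whether any full row or column is marked.
func isBingoMask(marked uint32) bool {
	for _, m := range bingoMasks() {
		if marked&m == m {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isBingoMask(0b11111)) // first row complete: true
	fmt.Println(isBingoMask(0b01111)) // only four cells marked: false
}
```

The ten masks could be computed once and cached, mirroring how the article caches reusable data elsewhere.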
</code></pre><p>I don&apos;t think I&apos;ve ever solved the second part of any Advent of Code puzzle quite this quickly.</p><h2 id="wrap-up">Wrap up</h2><p>There&apos;s little to say about today&apos;s puzzle. It was an easy puzzle and serves as a good exercise to practice some basic programming.</p><p>A possible optimization for <code>isBingo</code> would be using a binary representation of <code>marked</code>, since we could then leverage bitmasks. Given that we&apos;re dealing with a low number of boards and performance wasn&apos;t an issue, I&apos;ll leave that as an exercise for later.</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div>]]></content:encoded></item><item><title><![CDATA[Advent of Code 2021 - Day 3]]></title><description><![CDATA[<p>Deep under the water, we find ourselves hearing creaking noises. Our submarine handily gives us a diagnostics report. Sadly, as is all too common in AoC, the report format is complete garbage, to put it mildly.</p><p>Our puzzle input comes in the form of binary numbers. 
The task is to</p>]]></description><link>https://geekmonkey.org/aoc2021-day3/</link><guid isPermaLink="false">61b7814c2cf4c600486f3606</guid><category><![CDATA[Advent Of Code]]></category><category><![CDATA[Golang]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Fri, 17 Dec 2021 15:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1608039192991-e5f900563754?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE2fHxhZHZlbnR8ZW58MHx8fHwxNjM5ODMxNDgz&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1608039192991-e5f900563754?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDE2fHxhZHZlbnR8ZW58MHx8fHwxNjM5ODMxNDgz&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Advent of Code 2021 - Day 3"><p>Deep under the water, we find ourselves hearing creaking noises. Our submarine handily gives us a diagnostics report. Sadly, as is oft too common in AoC, the report format is complete garbage, to put it mildly.</p><p>Our puzzle input comes in the form of binary numbers. The task is to compute a <strong>gamma rate</strong> and an <strong>epsilon rate</strong> based on a simple algorithm. The example report given to us looks like this:</p><pre><code>00100
11110
10110
10111
10101
01111
00111
11100
10000
11001
00010
01010</code></pre><p>Instead of taking these binary numbers as-is, we need to construct new ones to get the gamma and epsilon rates. We do this by looking at the binary numbers bit by bit. The gamma rate results from taking the most common bit amongst all binary numbers for each position. </p><p>A short example can illustrate this much better than words, so let&apos;s take the first three binary numbers from above:</p><pre><code>00100
11110
10110
=====
10110</code></pre><p>By looking at the first bit of each number, we can see that there are two 1s and only one 0, so the first digit of the gamma rate is 1. We continue this for each digit until we get a gamma rate of <code>10110</code> or <code>22</code> in decimal. </p><p>The epsilon rate is obtained the same way, but we&apos;re taking the least common bit this time. A careful observer will notice that this can also be obtained by simply negating each bit of the gamma rate.</p><h3 id="part-1">Part 1</h3><p>Na&#xEF;vely, we can skip converting the inputs to their actual binary representation and work on them as ASCII characters. We can read the file into a <code>[]string</code>. For each position in the bitstring, we use a counter that increases for every 1 and decreases for every 0. </p><pre><code class="language-go">var cntr []int = make([]int, len(lines[0]))
for _, line := range lines {
	for idx, ch := range line {
		switch ch {
			case &apos;1&apos;:
				cntr[idx] += 1
			case &apos;0&apos;:
				cntr[idx] -= 1
		}
	}
}
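To sanity-check the counting loop together with the shift-based construction that follows, here they are wired into a single function (the wrapping and the hard-coded example report are mine); on the example input this should produce a gamma rate of 22 and an epsilon rate of 9, for a power consumption of 198:

```go
package main

import "fmt"

// powerConsumption combines the per-position bit counter with the
// shift-based construction of gamma and epsilon.
func powerConsumption(lines []string) (gamma, epsilon uint64) {
	cntr := make([]int, len(lines[0]))
	for _, line := range lines {
		for idx, ch := range line {
			switch ch {
			case '1':
				cntr[idx]++
			case '0':
				cntr[idx]--
			}
		}
	}
	for idx, val := range cntr {
		if val > 0 {
			gamma |= 1 << (len(cntr) - 1 - idx)
		} else {
			epsilon |= 1 << (len(cntr) - 1 - idx)
		}
	}
	return gamma, epsilon
}

func main() {
	report := []string{
		"00100", "11110", "10110", "10111", "10101", "01111",
		"00111", "11100", "10000", "11001", "00010", "01010",
	}
	gamma, epsilon := powerConsumption(report)
	fmt.Println(gamma, epsilon, gamma*epsilon) // 22 9 198
}
```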
</code></pre><p>In the next step, we can use this to construct the gamma and epsilon values using binary shifts:</p><pre><code class="language-go">var gamma, epsilon uint64
for idx, val := range cntr {
	if val &gt; 0 {
		gamma = gamma | 1 &lt;&lt; (len(cntr) - 1 - idx)
	} else {
		epsilon = epsilon | 1 &lt;&lt; (len(cntr) - 1 - idx)
	}
}</code></pre><p>We can then compute the puzzle answer as the product of gamma and epsilon.</p><h3 id="part-2">Part 2</h3><p>Part two is a bit of a tricky variation on part one. Instead of finding the most common bit value at each position, we use the most common bit as a cumulative filter for the whole list of numbers.</p><p>Initially, I solved this by building a binary tree where the edges represented a zero or a one. By traversing the tree and walking in the direction of the subtree with the most child nodes, it was easy to find the solution. </p><p>A binary tree isn&apos;t very efficient for this type of puzzle, though: after all, we&apos;re just dealing with binary numbers, and if there&apos;s something computers are fast at, it&apos;s binary.</p><p>A much faster solution comes from my friend <a href="https://github.com/javorszky/adventofcode2021/tree/main/day03?ref=geekmonkey.org#task-2">@javorszky</a>, who solved this part recursively. His algorithm makes use of bitmasks. To explain this, let&apos;s take the first four numbers:</p><pre><code>00100
11110
10110
10111</code></pre><p>For each position, we construct a bitmask. For the first (most significant) position of these five-bit numbers, this is simply <code>1&lt;&lt;4</code> or <code>10000</code>. Any number bigger than or equal to the bitmask has a one in that position; any number smaller has a zero. We continue this until we&apos;ve filtered all the sublists down into a single element.</p><pre><code class="language-go">func filterNumbers(list []uint, pos int, larger bool) uint {
	if len(list) == 1 {
		return list[0]
	}
	mask := uint(1 &lt;&lt; pos)
	ones := []uint{}
	zeros := []uint{}

	for _, item := range list {
		if item&amp;mask &gt;= mask {
			ones = append(ones, item)
		} else {
			zeros = append(zeros, item)
		}
	}

	if (len(ones) &gt;= len(zeros) &amp;&amp; larger) || (len(ones) &lt; len(zeros) &amp;&amp; !larger) {
		return filterNumbers(ones, pos-1, larger)
	}
	return filterNumbers(zeros, pos-1, larger)
}
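To see the recursive filter in action end to end, here is a small driver (the driver and the hard-coded example report are mine; the function is repeated verbatim so the snippet runs standalone). Filtering with larger set to true keeps the most common bit and yields the oxygen generator rating; false keeps the least common bit and yields the CO2 scrubber rating:

```go
package main

import "fmt"

// filterNumbers is the recursive filter from the article, repeated here
// so the snippet is self-contained.
func filterNumbers(list []uint, pos int, larger bool) uint {
	if len(list) == 1 {
		return list[0]
	}
	mask := uint(1 << pos)
	ones := []uint{}
	zeros := []uint{}

	for _, item := range list {
		if item&mask >= mask {
			ones = append(ones, item)
		} else {
			zeros = append(zeros, item)
		}
	}

	if (len(ones) >= len(zeros) && larger) || (len(ones) < len(zeros) && !larger) {
		return filterNumbers(ones, pos-1, larger)
	}
	return filterNumbers(zeros, pos-1, larger)
}

func main() {
	// Example report as 5-bit numbers; filtering starts at bit position 4.
	report := []uint{
		0b00100, 0b11110, 0b10110, 0b10111, 0b10101, 0b01111,
		0b00111, 0b11100, 0b10000, 0b11001, 0b00010, 0b01010,
	}
	oxygen := filterNumbers(report, 4, true) // keep the most common bit
	co2 := filterNumbers(report, 4, false)   // keep the least common bit
	fmt.Println(oxygen, co2, oxygen*co2)     // 23 10 230
}
```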
</code></pre><h3 id="wrapping-up">Wrapping up</h3><p>This was an interesting challenge that offered many solutions. We&apos;re not yet in the realm of puzzles that need heavily optimized algorithms, so it&apos;s great to play around a little. Implementing a binary tree from scratch isn&apos;t something I get to do very often. I&apos;m definitely looking forward to more challenging puzzles going forward, but this was another well-crafted puzzle.</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div>]]></content:encoded></item><item><title><![CDATA[Advent of Code 2021 - Day 2]]></title><description><![CDATA[The advent of code is an annual coding challenge in the month of December. 
This article covers the second day's puzzle with a solution written in Go.]]></description><link>https://geekmonkey.org/aoc2021-day2/</link><guid isPermaLink="false">61b716be2cf4c600486f357c</guid><category><![CDATA[Advent Of Code]]></category><category><![CDATA[Golang]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Tue, 14 Dec 2021 15:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1606482714043-600dc0b89ae0?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGFkdmVudCUyMGNhbGVuZGFyfGVufDB8fHx8MTYzOTQxNjE3OQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1606482714043-600dc0b89ae0?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGFkdmVudCUyMGNhbGVuZGFyfGVufDB8fHx8MTYzOTQxNjE3OQ&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Advent of Code 2021 - Day 2"><p>Still with me? Great! Let&apos;s head on to solving day 2.</p><p>The story tells us that we&apos;re still in the submarine and are trying to learn how to navigate it. The submarine can take a bunch of simple commands in the form of <code>direction unit</code>. </p><pre><code>forward 5
down 5
forward 8
up 3
down 8
forward 2</code></pre><p>Someone has already programmed a course for us and it&apos;s our task to figure out where we&apos;re going. We&apos;re to compute the final <em>horizontal</em> position and our final <em>depth</em>. An important note here is that <code>down</code> increases the depth while <code>up</code> decreases the depth of the submarine.</p><h3 id="part-1">Part 1</h3><p>Since I&apos;m trying to learn some concepts of Go I decided to represent the commands as a struct:</p><pre><code class="language-go">type command struct {
    direction string
    steps int
}</code></pre><p>This is easy enough. We have a direction and a number of steps in the defined direction.</p><p>We can now write a function to read our puzzle input. </p><pre><code class="language-go">func getInputs() []command {
	bytes, _ := ioutil.ReadFile(filename)

	lines := strings.Split(string(bytes), &quot;\n&quot;)
	cmds := []command{}

	for _, line := range lines {
		if len(line) == 0 {
			continue
		}
		instr := strings.Split(line, &quot; &quot;)
		val, _ := strconv.Atoi(instr[1])
		cmds = append(cmds, command{instr[0], val})
	}
	return cmds
}</code></pre><p>Once again, there&apos;s not too much magic going on here. We read the file, split it into lines, and then further split the lines into the two components representing our direction and a number. A more experienced Go programmer would probably use a <code>bufio.Scanner</code> to read the file line-by-line, but for now, I&apos;m content that this will do the job just fine.</p><p>Computing our horizontal position and depth is now as easy as looping over all commands and tallying up the steps based on the direction:</p><pre><code class="language-go">var pos, depth int

for _, cmd := range cmds {
	switch cmd.direction {
	case &quot;forward&quot;:
		pos += cmd.steps
	case &quot;down&quot;:
		depth += cmd.steps
	case &quot;up&quot;:
		depth -= cmd.steps
	}
}</code></pre><h3 id="part-2">Part 2</h3><p>The second part of the story suggests that the final position we calculated doesn&apos;t make any sense, and so an alternative algorithm is proposed. The <code>down</code> and <code>up</code> commands don&apos;t directly affect our position but represent changes to the submarine&apos;s <em>aim</em>. </p><p>Assume for a moment our instructions were:</p><pre><code>forward 5
down 1
forward 2</code></pre><p>In Part 1 this would have resulted in a horizontal position of 7 and a depth of 1. In Part 2 the interpretation changes to:</p><ul><li>Move 5 steps forward</li><li>Adjust aim to 1 (aim down!)</li><li>Move 2 more steps forward and 2 steps down in depth (since our aim is at 1 and we&apos;re stepping two times)</li></ul><p>In other words, every movement forward by X steps will change our depth by <code>X * aim</code>.</p><p>We need to modify our code ever so slightly to keep track of our sub&apos;s aim:</p><pre><code class="language-go">var aim int
pos = 0
depth = 0

for _, cmd := range cmds {
    switch cmd.direction {
    case &quot;forward&quot;:
        pos += cmd.steps
        depth += aim * cmd.steps
    case &quot;down&quot;:
        aim += cmd.steps
    case &quot;up&quot;:
        aim -= cmd.steps
    }
}</code></pre><p>That&apos;s it. We&apos;re done with part 2.</p><h3 id="summary">Summary</h3><p>Day 2 built really well on the previous day. It&apos;s good enough to practice I/O and processing text input. The task should be easy to solve even if the programming language you&apos;re using is relatively new to you. </p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div>]]></content:encoded></item><item><title><![CDATA[Advent of Code 2021 - Day 1]]></title><description><![CDATA[The advent of code is an annual coding challenge in the month of December. This article covers the first day's puzzle with a solution written in Go.]]></description><link>https://geekmonkey.org/aoc2021-day1/</link><guid isPermaLink="false">61b3798a2cf4c600486f347c</guid><category><![CDATA[Advent Of Code]]></category><category><![CDATA[Golang]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Sun, 12 Dec 2021 14:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1606482512676-255bf02be7cf?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fGFkdmVudHxlbnwwfHx8fDE2MzkxNTM5Mjk&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1606482512676-255bf02be7cf?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDR8fGFkdmVudHxlbnwwfHx8fDE2MzkxNTM5Mjk&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Advent of Code 2021 - Day 1"><p>Over the last few years, Advent of Code has gained a lot of popularity. 
Many developers use it as a creative way to challenge themselves or practice their skills in new programming languages. Nearly 200k people completed the first day this year.</p><p>As for me, I&apos;m using this year to learn and practice Go. While I was one of the early adopters of Go in 2009 and used it for about a year during my postgraduate studies, I have not touched it since, as my career took me in a different direction. With the rising popularity of Kubernetes and microservices, Go seems like an excellent addition to my toolset.</p><p>I&apos;ll attempt to cover all of this year&apos;s challenges and explain my thought process going into them. Since the later exercises usually take a lot of time to solve, I expect this series to run well into 2022. </p><h3 id="part-1">Part 1</h3><p>As per usual, the first day starts relatively easily. We&apos;re in a submarine with a sonar sweeping the ocean floor and recording the depth at certain intervals. The first task is to count the number of times a depth measurement increases. </p><p>We first need to read our puzzle input, which is given as a sequence of positive integers with each number on a new line.</p><pre><code class="language-go">func loadMeasurements() []int {
    bytes, err := ioutil.ReadFile(filename)
    if err != nil {
        panic(err)
    }

    lines := strings.Split(string(bytes), &quot;\n&quot;)

    nums := []int{}
    for _, val := range lines[:len(lines)-1] {
        num, _ := strconv.Atoi(val)
        nums = append(nums, num)
    }

    return nums
}</code></pre><p>This snippet gives us an int-slice with all numbers from the file. It&apos;s now trivial to loop over this slice and count the number of increments:</p><pre><code class="language-go">var count, last int

measurements := loadMeasurements()
last = measurements[0]
for _, num := range measurements {
	if num &gt; last {
		count++
	}
	last = num
}
fmt.Printf(&quot;Number of increments: %d\n&quot;, count)</code></pre><p>This solves part 1. </p><h3 id="part-2">Part 2 </h3><p>We&apos;ve now concluded that looking at every single measurement isn&apos;t useful, so we decide to instead look at a sliding window of three measurements. The goal is still the same: count the number of times the sum of a three-measurement window is bigger than the sum of the previous window.</p><p>To visualize this a little bit, imagine the numbers are: <code>1 2 3 4 5 6</code>. In this example, the sum of the first three measurements <code>1 2 3</code> is <code>6</code>. In the next step, we&apos;re looking at <code>2 3 4</code>, whose sum is <code>9</code>. This continues until we&apos;ve run out of numbers.</p><p>The code for this is fairly simple to write. Start with the first three measurements and then step through the entire slice one by one:</p><pre><code class="language-go">count = 0
prev := math.MaxInt
for i := 0; i &lt; len(measurements)-2; i++ {
	sum := measurements[i] + measurements[i+1] + measurements[i+2]
	if sum &gt; prev {
		count++
	}
	prev = sum
}</code></pre><p>It&apos;s important to recognize that we want to ignore the first window, and we can do this by initializing <code>prev</code> to <code>math.MaxInt</code>, which ensures that our first window sum can&apos;t be bigger than <code>prev</code>.</p><h3 id="summary">Summary</h3><p>That&apos;s it for day 1. No crazy algorithms yet, but enough to prepare you for reading the problem descriptions and doing basic IO. </p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Learn With Me: Julia - Bitwise Operators (#4)]]></title><description><![CDATA[This article covers bitwise operators in Julia and uses them to implement a CRC32 algorithm to validate PNG data chunks.]]></description><link>https://geekmonkey.org/lwm-julia-bitwise-operators/</link><guid isPermaLink="false">60b7409f72716f003eed7b77</guid><category><![CDATA[Learn With Me: Julia]]></category><category><![CDATA[Julia]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Tue, 08 Jun 2021 14:00:00 GMT</pubDate><content:encoded><![CDATA[<p>In part 3, we learned about structs and binary I/O in Julia. We looked into how the PNG format stores metadata and how it is represented using chunks. Each chunk consisted of four components: length, type, data, and a four-byte CRC. </p><p>CRC stands for &quot;Cyclic Redundancy Check,&quot; and these checks are commonly used in digital networks and storage devices to detect transmission or read errors in data. 
Sometimes one or more bits can get flipped during data transmission, and CRCs are one of the tools employed to detect this.</p><p>CRC is a so-called error detection algorithm. The simplest algorithm in this class is called the parity bit. The idea with the parity bit (in its even variant) is that you count all the 1s in a binary string and append a parity bit chosen so that the total number of 1s is even. With this technique, you can detect any odd number of bit flips but are basically out of luck when an even number of bits are flipped.</p><p>In this article, we&apos;ll look at implementing CRC32 using Julia bitwise operators. The <a href="https://en.wikipedia.org/wiki/Cyclic_redundancy_check?ref=geekmonkey.org">Wikipedia</a> page is a great starting point if you want to learn a bit more before jumping into the code.</p><h3 id="bitwise-operators-in-julia">Bitwise Operators in Julia</h3><p>Like all other modern programming languages, Julia comes with bitwise operators. They apply to all primitive integer types. I don&apos;t think it&apos;s necessary for me to cover all of them here, so I&apos;ll focus on the ones we need for this post:</p><p><strong>AND</strong>, denoted in Julia by an ampersand <code>&amp;</code>, is true only if both operands are true.</p><pre><code>a  b  | a &amp; b
0  0  | 0
0  1  | 0
1  0  | 0
1  1  | 1</code></pre><p><strong>Bitshift</strong> is an operation that moves a sequence of bits either left or right. In Julia, the operator for this is <code>&gt;&gt;</code> or <code>&lt;&lt;</code> depending on whether you want to shift the bits right or left. Shifting left by <em>n</em> bits has the effect of multiplying a binary number by \(2^n\). Likewise shifting right by <em>n</em> bits will do the inverse and divide by \(2^n\). When used, it&apos;s important to know that the first operand is always the bits operated on, and the second denotes the number of bits to shift. <code>2 &lt;&lt; 5</code> means &quot;shift the binary representation of 2 (10) left by 5 bits&quot;, yielding 1000000 (64 in decimal).</p><p><strong>XOR,</strong> also called <em>exclusive or</em>, is an operator that&apos;s true if and only if the operands are different. Most languages use the <code>^</code> (caret) to symbolise XOR; Julia uses <code>&#x22BB;</code> (which can be entered using the LaTeX notation <code>\xor</code>) but also offers the <code>xor</code> function.</p><pre><code>a  b  | a &#x22BB; b
-----------------
0  0  | 0
0  1  | 1
1  0  | 1
1  1  | 0</code></pre><p>With the operators covered, let&apos;s now look at implementing the CRC verification algorithm for our PNG chunks.</p><h3 id="bitwise-crc32">Bitwise CRC32</h3><p>The simplest implementation of the CRC algorithm is operating bitwise. CRC is essentially an implementation of polynomial division where our data is the dividend, and the remainder of the division is the CRC for our data. Don&apos;t worry, we don&apos;t have to get super mathematical here: in binary, polynomial division can easily be done using XOR. </p><p>For CRC32 the generator polynomial, or divisor, is $$x^{32} + x^{26} + x^{23} + x^{22} + x^{16} + x^{12} + x^{11} + x^{10} + x^{8} + x^{7} + x^{5} + x^{4} + x^{2} + x + 1$$ which can also be written in hex as <em>0x04C11DB7</em>. In practice this is often reversed since it makes the algorithm easier to implement in software. The resulting constant is <em>0xEDB88320</em>. </p><figure class="kg-card kg-code-card"><pre><code class="language-C">unsigned int crc32(unsigned char *message) {
    int i, j;
    unsigned int byte, crc, mask;
    
    i = 0;
    crc = 0xFFFFFFFF;
    while (message[i] != 0) {
    	byte = message[i];          // Get next byte.
        crc = crc ^ byte;
        for (j = 7; j &gt;= 0; j--) {  // Do eight times.
            mask = -(crc &amp; 1);
            crc = (crc &gt;&gt; 1) ^ (0xEDB88320 &amp; mask);
        }
        i = i + 1;
    }
    return ~crc;
}</code></pre><figcaption>Basic bitwise CRC32 algorithm, taken from <a href="https://amzn.to/3wKTuAW?ref=geekmonkey.org">Hacker&apos;s Delight</a></figcaption></figure><p>The above implementation appears in one of my favorite algorithm books written by Henry S. Warren Jr., called <a href="https://amzn.to/3wKTuAW?ref=geekmonkey.org">Hacker&apos;s Delight</a>. It features many other algorithms and techniques, and its most recent edition has a whole section on CRC32.</p><p>We can start by converting the C implementation from Hacker&apos;s Delight to Julia. Julia strings contain variable-length characters, so it is necessary to ensure they are converted to <code>UInt8</code> first, since the algorithm can only operate on individual bytes.</p><pre><code class="language-julia">function crc32(message::Vector{UInt8})
    crc = 0xFFFFFFFF
    for byte in message
        crc = crc &#x22BB; byte
        for _ in 1:8
            mask = -UInt32(crc &amp; 1)
            crc = (crc &gt;&gt; 1) &#x22BB; (0xEDB88320 &amp; mask)
        end
    end
    ~crc
end</code></pre><p>We can trivially test this by picking any chunk from a PNG file and passing the type and data fields in as the message. The easiest one I found was the IEND chunk since it doesn&apos;t even have any data. This will always have the same CRC of 0xAE426082. Running the following snippet, we can see our code working:</p><pre><code class="language-julia">message = &quot;IEND&quot;
crc = crc32(Vector{UInt8}(message))  # Returns a UInt32
print(string(crc, base=16)) # Prints: ae426082
</code></pre><p>Done! Or are we...?</p><h3 id="bytewise-crc32">Bytewise CRC32</h3><p>For PNG, there&apos;s a <a href="https://www.w3.org/TR/PNG-CRCAppendix.html?ref=geekmonkey.org">reference implementation</a> available in C that operates byte-wise instead of bitwise, allowing for a drastic speedup in CRC computation. We can adapt it to work in Julia as well.</p><p>The reference implementation uses a pre-built table to speed up performance. This table stores all possible combinations for a single byte resulting in a table of 256 elements. There are other implementations out there that extend this to 2 bytes or 65,536 elements.</p><pre><code class="language-julia">function maketable()
    crc_table = Vector{UInt32}(undef, 256)

    for n in 0:255
        c = convert(UInt32, n)
        for _ in 1:8
           if c &amp; 1 == 1
               c = 0xEDB88320 &#x22BB; (c &gt;&gt; 1)
           else
               c = c &gt;&gt; 1
           end
        end
        crc_table[n+1] = c
    end
    crc_table
end</code></pre><p>Note that since Julia arrays are 1-indexed, we have to adapt the algorithm to take care of this.</p><p>Since the table is reusable (as long as we use the same generator polynomial), it&apos;s advisable to cache it somehow. One way of doing this is using a const variable in the global scope:</p><pre><code class="language-julia">const crc_table = maketable()</code></pre><p>The main CRC32 algorithm can now use this table to conveniently iterate over the input data byte-by-byte and save many compute cycles. Note that we had to <code>+ 1</code> the lookup here (as compared with the C version) to accommodate Julia&apos;s 1-indexed arrays.</p><pre><code class="language-julia">function crc32(data::Vector{UInt8}, crc::UInt32)
    c = crc
    for byte in data 
        c = crc_table[((c &#x22BB; UInt8(byte)) &amp; 0xff) + 1] &#x22BB; (c &gt;&gt; 8)
    end
    return c
end
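
# Consistency check: seeded with 0xFFFFFFFF and XOR-ed with 0xFFFFFFFF at the
# end, the table-driven version computes a standard CRC-32, so the test
# vector &quot;123456789&quot; should again give 0xCBF43926.
@assert (crc32(Vector{UInt8}(&quot;123456789&quot;), 0xFFFFFFFF) &#x22BB; 0xFFFFFFFF) == 0xCBF43926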
</code></pre><p>With some simple multi-dispatch, we can even add a version that takes Julia strings:</p><pre><code class="language-julia">crc32(data::AbstractString, crc::UInt32) = crc32(Vector{UInt8}(data), crc)</code></pre><p>Finally, we can now apply this to our PNG chunk. The PNG spec mentions that the CRC is computed over the bytes from the type and data field and not the length field. Since we&apos;re still dealing with <code>Vector{UInt8}</code> we can use <code>vcat</code> to concatenate those two fields. </p><p>Additionally, the PNG spec demands that we initialize the CRC with 0xFFFFFFFF and XOR the result again afterward. We can implement a <code>crc32</code> function for our <code>PNGChunk</code> type as follows:</p><pre><code class="language-julia">crc32(c::PNGChunk) = crc32(vcat(c.type, c.data), 0xFFFFFFFF) &#x22BB; 0xFFFFFFFF</code></pre><p>We can now validate the crc32 for each chunk against the CRC we read from the file. First, we need to fix the byte order and ensure we&apos;re comparing values of the same type. The CRC in our PNGChunk is represented as a <code>Vector{UInt8}</code> with the least significant byte coming first. We can reverse the order using <code>reverse</code> and turn the vector into a single <code>UInt32</code> by using <code>reinterpret</code> again.</p><pre><code class="language-julia">isvalid(c::PNGChunk) = crc32(c) == reinterpret(UInt32, reverse(c.crc))[1]</code></pre><p>I&apos;ve modified our <code>Base.show</code> function from last time to output the data a bit more clearly, and it now looks like this:</p><pre><code class="language-julia">function Base.show(io::IO, c::PNGChunk)
    println(io, &quot;Length: &quot;, length(c))
    println(io, &quot;Type: &quot;,  type(c))
    println(io, &quot;Data: &quot;, datastr(c))
    println(io, &quot;CRC: &quot;, crc32(c), &quot;\t&quot;, isvalid(c) ? &quot;OK&quot; : &quot;INVALID&quot;)
    println(io, &quot;-----&quot;)
end
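
# Quick check, assuming the PNGChunk struct from the previous post: the IEND
# chunk has no data, so its CRC is computed over the type bytes alone and
# should come out as 0xAE426082.
iend = PNGChunk(0, Vector{UInt8}(&quot;IEND&quot;), UInt8[], UInt8[])
@assert crc32(iend) == 0xAE426082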
</code></pre><p>That&apos;s it - we have successfully implemented a basic CRC32 algorithm in Julia. </p><p>Julia comes with a core package called CRC32c using the same algorithm but uses a different generator polynomial, so we, unfortunately, can&apos;t use it. There is, however an excellent third-party library called <a href="https://github.com/andrewcooke/CRC.jl?ref=geekmonkey.org">CRC.jl</a> that you might want to check out.</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Learn With Me: Julia - Structs and Binary I/O (#3)]]></title><description><![CDATA[<p>Diagrams.net (formerly draw.io) is a fantastic website and tool that allows you to create rich diagrams. The service is entirely free and diagrams can be saved to your Google Drive, Dropbox, or downloaded to your computer. 
Additionally, diagrams.net allows you to export your diagrams to various formats</p>]]></description><link>https://geekmonkey.org/learn-with-me-julia-structs-and-binary-i-o-3/</link><guid isPermaLink="false">60aaaad1e50698003b73279f</guid><category><![CDATA[Learn With Me: Julia]]></category><category><![CDATA[Julia]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Tue, 01 Jun 2021 14:15:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1573867607131-872f83689352?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGRpYWdyYW18ZW58MHx8fHwxNjIyMjEyMzA3&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1573867607131-872f83689352?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDJ8fGRpYWdyYW18ZW58MHx8fHwxNjIyMjEyMzA3&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Learn With Me: Julia - Structs and Binary I/O (#3)"><p>Diagrams.net (formerly draw.io) is a fantastic website and tool that allows you to create rich diagrams. The service is entirely free and diagrams can be saved to your Google Drive, Dropbox, or downloaded to your computer. Additionally, diagrams.net allows you to export your diagrams to various formats such as SVG, JPG, and PNG.</p><p>Recently it was pointed out to me that you can actually load an exported PNG diagram back into the tool and edit it again. This got me thinking - how are they doing this? Surely they aren&apos;t using image recognition techniques to identify objects in the image?</p><p>You may wonder: What does all of this have to do with the title of this post? Let&apos;s talk about the PNG format a little.</p><h2 id="png-file-format">PNG File Format</h2><p>PNG stands for Portable Network Graphics. The format has existed in some form since the mid-nineties. Like all binary file formats, it follows a specification. 
From the <a href="http://www.libpng.org/pub/png/spec/1.2/PNG-Contents.html?ref=geekmonkey.org">specification</a>, we can learn a lot about how the image and its metadata is represented on disk.</p><h3 id="file-header">File Header</h3><p>A PNG file starts with an <a href="http://www.libpng.org/pub/png/spec/1.2/PNG-Rationale.html?ref=geekmonkey.org#R.PNG-file-signature">8-byte signature</a>. This signature tells a decoder that all bytes that follow are to be interpreted based on the PNG spec. In hexadecimal representation the header is: <code>89 50 4e 47 0d 0a 1a 0a</code></p><h3 id="chunks">Chunks</h3><p>The remainder of the PNG format follows a very simple structure. Data is represented in chunks. Each chunk starts with 4 bytes describing the <strong>length</strong> of <strong>chunk data.</strong> Then follow 4 bytes for the <strong>chunk type</strong>. This again is followed by <em>length</em> bytes of chunk data and finally 4 more bytes for a CRC (cyclic-redundancy check). The CRC can be computed over the chunk type and chunk data.</p><!--kg-card-begin: markdown--><table>
<thead>
<tr>
<th>Length</th>
<th>Chunk type</th>
<th>Chunk data</th>
<th>CRC</th>
</tr>
</thead>
<tbody>
<tr>
<td>4 bytes</td>
<td>4 bytes</td>
<td><em>Length</em> bytes</td>
<td>4 bytes</td>
</tr>
</tbody>
</table>
<!--kg-card-end: markdown--><p>The file specification mentions that while the length is represented using 4 bytes (32 bits), the maximum length of chunk data is actually 2^31 - 1 bytes.</p><p>The chunk types are more interesting as there are plenty of them. I won&apos;t go into much detail here and instead only cover the relevant bits for this post. I encourage you to go read the specification for yourself to understand the nifty encoding techniques used here.</p><p>Since the chunk type is represented by 4 bytes, it can (mostly) be represented using 4 ASCII characters. Chunk types are split into critical and ancillary chunks - a decoder must understand all critical chunks but can safely ignore the ancillary ones.</p><p>The critical chunks are as follows:</p><ul><li><code>IHDR</code> must be the first chunk in the file. It contains, in a specific order, the image width, height, bit depth, color type, compression method, filter method and interlace method.</li><li><code>PLTE</code> contains information about the color palette used</li><li><code>IDAT</code> contains the actual image. There can be multiple <code>IDAT</code> chunks, which is what allows PNG to be a streamable format in which a first, smaller IDAT chunk allows a pre-render of the full image before all data is received.</li><li><code>IEND</code> marks the end of the file</li></ul><p>A selection of ancillary chunks:</p><ul><li><code>tIME</code> stores the time the image was last changed</li><li><code>tEXt</code> stores key-value metadata. The text is encoded in ISO 8859-1. The key must be between 1 and 79 characters long and is terminated by a null character. The remainder of the chunk data is the value.</li></ul><h2 id="exporting-a-diagram-from-diagramsnet">Exporting a diagram from diagrams.net</h2><p>Before we get started with writing some Julia code, let&apos;s first export a PNG file from diagrams.net. 
</p><p>This is fairly straightforward, just head over to diagrams.net, click together a diagram and hit File &gt; Export and choose PNG. Make sure to keep the &quot;Include a copy of my diagram&quot; checkbox checked.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/05/Diagram.png" class="kg-image" alt="Learn With Me: Julia - Structs and Binary I/O (#3)" loading="lazy" width="121" height="201"><figcaption>Sample diagram</figcaption></figure><h2 id="file-io-in-julia">File IO in Julia</h2><p>With everything prepared we can start looking into I/O. We&apos;re not going to do anything advanced here so we&apos;ll just look at the basics.</p><p>Interacting with files, regardless of the language, always follows the same pattern:</p><ul><li>Open a file for reading/writing</li><li>Read/write</li><li>Close the file descriptor</li></ul><p>Julia is no exception to this. We can use <a href="https://docs.julialang.org/en/v1/base/io-network/?ref=geekmonkey.org#Base.open">Base.open</a> to open a file. This will give us an IOStream instance which in turn wraps the OS file descriptor. We can either do it in a block, in which case the file will be closed automatically at the end of the block, or we call open/close separately.</p><pre><code class="language-julia">open(&quot;myfile.txt&quot;, &quot;r&quot;) do io
   # ...
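   # For example, read the file&apos;s entire contents into a byte vector:
   data = read(io)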
end;</code></pre><p>Furthermore, there are multiple ways to read data from a file. </p><p>We&apos;ll need <code>read</code> and <code>readbytes!</code>. They both take an <code>IOStream</code> (the result of the open call) as the first argument. <code>read</code> takes a primitive type as a second argument telling it to read a single value of that type from the <code>IO</code> and return it. I.e. <code>read(io, UInt32)</code> will read the 4 bytes it takes to represent a UInt32.</p><p><code>readbytes!</code> requires a vector-like object to be passed as its second argument. It will read as many bytes as the vector can hold as long as there&apos;s data to read.</p><h3 id="reading-in-the-png-file">Reading in the PNG file</h3><p>Let&apos;s put what we&apos;ve just learned together. Here&apos;s the plan:</p><ul><li>Open the PNG file</li><li>Check for the file header (remember those 8 bytes mentioned above?)</li><li>Read in PNG chunks by first consuming the length, the type, the data based on the length field and finally the CRC.</li></ul><p>We can represent PNG chunks using a struct with named fields for each of the elements. The easiest way to represent a sequence of bytes is using a <code>Vector{UInt8}</code>. Here&apos;s the struct I came up with:</p><pre><code class="language-julia">struct PNGChunk
    length::UInt32
    type::Vector{UInt8}
    data::Vector{UInt8}
    crc::Vector{UInt8}
end</code></pre><p>It&apos;s also useful to declare a constant for holding the PNG header:</p><pre><code class="language-julia">const PNG_HEADER = [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]
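
# Sanity check: bytes 2-4 of the signature are the ASCII letters &quot;PNG&quot;.
@assert String(Char.(PNG_HEADER[2:4])) == &quot;PNG&quot;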
</code></pre><p>Let&apos;s now open the PNG file and read in the first 8 bytes for the header:</p><pre><code class="language-julia">io = open(&quot;Diagram.png&quot;, &quot;r&quot;)
header = Vector{UInt8}(undef, 8)
readbytes!(io, header)</code></pre><p><code>readbytes!</code> takes an <code>IOStream</code> handle and a variable that it will try to fill. You can pass it an additional integer to indicate the number of bytes to read but it defaults to the length of the second argument which we&apos;ve declared as a vector of <code>UInt8</code>s with 8 elements.</p><p>By simply comparing <code>header</code> with <code>PNG_HEADER</code> we can determine whether we&apos;re dealing with a valid PNG file:</p><pre><code class="language-julia">if header &#x2260; PNG_HEADER
    throw(ArgumentError(&quot;File is not a PNG&quot;))
end
</code></pre><p>Assuming our file is valid, we can now attempt to read in all the chunks in the file. It&apos;s easiest to do this iteratively with a loop that consumes the file until we hit EOF, collecting each chunk into a vector as we go. Luckily Julia provides an <code>eof</code> function that takes an <code>IOStream</code> and returns whether or not we&apos;ve reached the end of the file.</p><pre><code class="language-julia">chunks = PNGChunk[]  # the chunks we&apos;ve read so far

while !eof(io)
    length = hton(read(io, UInt32))

    type = Vector{UInt8}(undef, 4)
    readbytes!(io, type)

    data = Vector{UInt8}(undef, length)
    readbytes!(io, data)

    crc = Vector{UInt8}(undef, 4)
    readbytes!(io, crc)

    push!(chunks, PNGChunk(length, type, data, crc))
end

close(io)  # final step of the open/read/close pattern
</code></pre><p>I&apos;m calling <code>hton</code> to get the length represented properly. This is because my system (Intel-based MacBook Pro) is a little-endian system (meaning the least significant byte comes first) but PNG represents all data in big-endian requiring us to reorder bytes. </p><p>The loop will continue to consume bytes for the chunk type, data, and the CRC and construct a PNGChunk that will then be pushed into a vector. </p><p><strong>Note:</strong> The above code will work for a valid PNG file. There&apos;s no error checking at all so if one of the fields is corrupted or the file ends prematurely this will throw an error and fail.</p><h3 id="displaying-chunks">Displaying chunks</h3><p>Now that we&apos;re done reading the file we should take a look at its contents. For this, we can add a bunch of helper functions. </p><p>We essentially want to run something like:</p><pre><code class="language-julia">for chunk in chunks
    print(chunk)
end</code></pre><p>but executing this will result in a lot of gibberish being displayed. To tell Julia how to display a <code>PNGChunk</code> we need to implement <code>Base.show</code> for our type. <code>Base.show</code> takes an <code>IO</code> object and an instance of a type. You can compare this with <code>__repr__</code> in Python. An implementation that will display the length and the type of a chunk might look as follows:</p><pre><code class="language-julia">function Base.show(io::IO, c::PNGChunk)
    println(io, length(c), &quot;\t&quot;, type(c))
end
</code></pre><p>Where in other languages you declare methods on classes, in Julia you simply declare a function that operates on a type. To make the implementation of <code>Base.show</code> work we need to define length and type:</p><pre><code class="language-julia">length(c::PNGChunk) = c.length
type(c::PNGChunk) = String(Char.(c.type))
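
# For example, a hand-built chunk with the four IHDR type bytes broadcasts
# them to Chars and joins them into the string &quot;IHDR&quot;:
@assert type(PNGChunk(0, [0x49, 0x48, 0x44, 0x52], UInt8[], UInt8[])) == &quot;IHDR&quot;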
</code></pre><p>While we could simply access <code>chunk.length</code> directly it&apos;s common practice to consider struct fields &quot;private&quot; and write functions to access them. This way you get a layer of abstraction and can easily change the layout of structs without breaking code all over the place.</p><p>To deconstruct what&apos;s going on in the second line let&apos;s start by looking at <code>c.type</code>. We declared the type to be a <code>Vector{UInt8}</code> and we consumed 4 bytes while reading the PNG file. The first thing we want to do is convert each item in the vector to its ASCII character representation. Julia provides the <code>Char</code> data type to represent 32-bit characters. Simply calling <code>Char(c.type)</code> would result in Julia attempting to consume all 4 bytes (32 bit) and won&apos;t give us the desired result.</p><p>Instead, we can iterate over the items in the vector and convert each item to a <code>Char</code>. This could be written using a list comprehension like <code>[Char(ch) for ch in c.type]</code> which is rather lengthy but standard if you&apos;re coming from Python. Julia conveniently offers the dot-operator (also called <em>broadcast</em>) which can be applied to any function. By writing <code>Char.(c.type)</code> we&apos;re essentially expressing &quot;apply each element in c.type to the Char function&quot;. </p><p>Finally, we wanted to obtain the string representation of those characters and by passing a <code>Vector{Char}</code> to the String function we can cast it into a string. </p><p>More tenured Julia developers would probably write all of the above simply as <code>collect(Char, c.type) |&gt; join</code>, but we&apos;re going to ignore this for now.</p><p>Ok, back to displaying the chunk. With <code>Base.show</code> and our two functions out of the way we can loop over the chunks and see what&apos;s inside our file:</p><pre><code>13      IHDR
970     tEXt
3379    IDAT
0       IEND
</code></pre><p>So that&apos;s cool - we&apos;ve got three chunks with data. IHDR contains height, width, color depth and some other metadata about the file and IDAT contains the actual image. This leaves <code>tEXt</code> which could contain anything. </p><h3 id="extracting-information-from-ihdr">Extracting information from IHDR</h3><p>Let&apos;s see if we can make sense of the data in the IHDR chunk. First we&apos;re going to modify our <code>Base.show</code> implementation to also display the data field when we recognize the chunk type.</p><pre><code class="language-julia">function Base.show(io::IO, c::PNGChunk)
    println(io, length(c), &quot;\t&quot;, type(c) ,&quot;\t&quot;, datastr(c))
end
</code></pre><p>The specification tells us that there are 13 bytes reserved for the IHDR data field and how many bytes are reserved for different properties.</p><pre><code>The IHDR chunk must appear FIRST. It contains:

   Width:              4 bytes
   Height:             4 bytes
   Bit depth:          1 byte
   Color type:         1 byte
   Compression method: 1 byte
   Filter method:      1 byte
   Interlace method:   1 byte</code></pre><p>The multi-byte fields will require endian conversion. Since we have already read in all data we need to reinterpret the data from our <code>Vector{UInt8}</code>. That&apos;s exactly the name of a Julia function that helps with reinterpreting data into another type:</p><pre><code class="language-julia">hton(reinterpret(UInt32, c.data[1:4])[1])</code></pre><p>This will take the first four bytes of chunk data and reinterpret them into a UInt32. The wrapping <code>hton</code> will make sure to convert from host byte order to big endian. We can repeat this for the height field and then read all the individual bytes.</p><pre><code class="language-julia">function datastr(c::PNGChunk)
    if type(c) == &quot;IHDR&quot;
        width = hton(reinterpret(UInt32, c.data[1:4])[1])
        height = hton(reinterpret(UInt32, c.data[5:8])[1])
        depth, ct, cm, fm, im = c.data[9:13]
        return &quot;h=$height, w=$width, d=$depth, color type=$ct, compression method=$cm, filter method=$fm, interlace method=$im&quot;
    end
    &quot;&quot;
end
</code></pre><p>For my diagram I get the following output:</p><pre><code>h=201, w=121, d=8, color type=6, compression method=0, filter method=0, interlace method=0</code></pre><h3 id="obtaining-the-original-diagram-from-text">Obtaining the original diagram from tEXt</h3><p>Finally, let&apos;s peek inside the <code>tEXt</code> chunk. We can first extend our <code>datastr(c::PNGChunk)</code> function to also have a branch to catch the <code>tEXt</code> type and simply print the contents of the data field:</p><pre><code> mxfile %3Cmxfile%20host%3D%22app.diagrams.net%22%20modified%3D%222021-05-24T09%3A22%3A42.489Z%22%20agent%3D%225.0%20
(Macintosh%3B%20Intel%20Mac%20OS%20X%2010_15_7)....</code></pre><p>That&apos;s a bunch of gibberish. Consulting the specification tells us that the data field for <code>tEXt</code> consists of a key and value pair separated by a null-byte. That should be easy to parse:</p><pre><code class="language-julia">key, value = split(String(Char.(c.data)), &apos;\0&apos;)
</code></pre><p>But that&apos;s only half the battle. It looks like the value part may be URL-encoded, so we need to find a way to decode it. I couldn&apos;t find this functionality in the standard library, so I ended up installing URIParser.jl, which implements <code>unescape</code>.</p><pre><code>(@v1.6) pkg&gt; add URIParser</code></pre><p>Putting everything together we can complete our <code>datastr</code> function by adding <code>tEXt</code> handling:</p><pre><code class="language-julia">    elseif type(c) == &quot;tEXt&quot;
        key, value = split(String(Char.(c.data)), &apos;\0&apos;)
        value = unescape(value)
        return &quot;$key, $value&quot;
    end</code></pre><p>And so the final output is:</p><pre><code>13      IHDR    h=201, w=121, d=8, color type=6, compression method=0, filter method=0, interlace method=0
970     tEXt    mxfile, &lt;mxfile host=&quot;app.diagrams.net&quot; modified=&quot;2021-05-24T09:22:42.489Z&quot; agent=&quot;5.0 (Maci
ntosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36&quot; et
ag=&quot;MSmUq0enpJxQ3pDGyP_L&quot; version=&quot;14.3.0&quot;&gt;&lt;diagram id=&quot;py8BCTe_me7SkJGnhe6H&quot; name=&quot;Page-1&quot;&gt;zZRNb4MwDEB/DcdJ
EDbaHruOdYdNm9TDdo2IC5kCRsF89dcvjFCKWKvtUGmXijw7dfxwcPxN2mw1z5MXFKAc5orG8R8cxpZLZn470FqwcHsQayl65I1gJw9g4ZBW
SgHFJJEQFcl8CiPMMohowrjWWE/T9qimVXMewwzsIq7m9F0KSiz1gtUYeAIZJ7b0ki36QMqHZNtJkXCB9QnyQ8ffaETqn9JmA6pzN3jp9z2e
iR4PpiGj32wItquqWN+tgjC+3fPXj0Ks4cbv/6XiqrQN28NSOxgAYYTYJWpKMMaMq3Ck9xrLTEBXxjWrMecZMTfQM/ATiFr7dnlJaFBCqbLR
vmZX6GxvFhVY6gguNDTMCNcx0IU8dnwDZnIBUyDdmn0aFCdZTc/B7QzFx7xRs3mwpv9g3ZtZX6f8ILN4Jn9U23mqE0mwy/m3gdrct580VqAJ
mssi543bDcy102qvqzdc1/pk+IeJTk7mPnCv5IrNXL0ppOLfmfK965kyy/E78R07+dj64Rc=&lt;/diagram&gt;&lt;/mxfile&gt;
3379    IDAT
0       IEND
</code></pre><p>The secret to how diagrams.net embeds the diagram is solved. It&apos;s urlencoded XML embedded into a <code>tEXt</code> chunk inside the PNG file (now that&apos;s a fun sentence to say!).</p><p>The full code can be found at <a href="https://github.com/halfdan/geekmonkey/tree/main/julia/lwm-03?ref=geekmonkey.org">https://github.com/halfdan/geekmonkey/tree/main/julia/lwm-03</a></p><h3 id="summary">Summary</h3><p>In this article, we&apos;ve covered a lot of different concepts in Julia. If you struggled to keep up - don&apos;t worry I&apos;ll go over all the concepts mentioned here in more detail in future posts. My approach to learning is often guided by the projects I want to do and so I often jump in at the deep end. As a result, this article introduced concepts rather rapidly without spending too much time on the mechanics.</p><p>It&apos;s always fascinating when you think about how many things we take for granted in tech without thinking about the underlying mechanics. I was definitely surprised by how easy it was to extract some metadata from a binary format like PNG. I&apos;ve used PNG files for decades without ever thinking about their inner structure. Clearly, we&apos;ve only scratched the surface and haven&apos;t looked at the IDAT chunk containing all the image information, but we&apos;ll get there with time. 
</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Learn With Me: Julia - Tools and Learning Resources (#2)]]></title><description><![CDATA[A short collection of tools and resources that help experienced developers learn Julia.]]></description><link>https://geekmonkey.org/lwm-julia-2-tools-and-learning-resources/</link><guid isPermaLink="false">60a11c2381a039004b6c5835</guid><category><![CDATA[Learn With Me: Julia]]></category><category><![CDATA[Julia]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Tue, 18 May 2021 15:00:00 GMT</pubDate><media:content url="https://geekmonkey.org/content/images/2021/05/lwm-julia-logo-2.png" medium="image"/><content:encoded><![CDATA[<img src="https://geekmonkey.org/content/images/2021/05/lwm-julia-logo-2.png" alt="Learn With Me: Julia - Tools and Learning Resources (#2)"><p>Before we really dive into Julia I wanted to go over the tools and learning resources I have and will be using going forward. These resources fit my learning journey and may not directly apply to you so I encourage you to spend some time to see what&apos;s out there.</p><h2 id="learning-resources">Learning Resources</h2><p>I have compiled a list of learning resources that I intend to explore over the next 6 months as I progress in the Julia language. Any learning journey is prone to fail if you don&apos;t work towards an end goal. 
As stated in my previous post, I want to use Julia to get into data science and machine learning, so it should come as no surprise that my selection of learning resources is biased toward these topics.</p><p>It was difficult to settle on a good book since the majority of books listed on the <a href="https://julialang.org/learning/books/?ref=geekmonkey.org">Julia website</a> seem to be targeted towards programming beginners and lack the depth I am looking for. Some other books have terrible ratings, so I excluded them. I ended up including only one book in the list below.</p><h3 id="basic">Basic</h3><p>The following resources should help build a basic understanding of programming in Julia and provide a good set of problems and challenges to practice learned skills.</p><ul><li>The Julia track on <a href="https://www.coursera.org/learn/julia-programming?ref=geekmonkey.org">exercism.io</a> where you can get your coding solutions mentored by volunteers. This provides plenty of exercises to apply some of the basics and get high-quality feedback.</li><li>The <a href="https://syl1.gitbook.io/julia-language-a-concise-tutorial/?ref=geekmonkey.org">Julia Language - A concise tutorial</a> gitbook, as the name suggests, gives a concise overview of several important topics such as I/O, data structures, metaprogramming, package development and concurrency. </li><li><a href="https://juliaacademy.com/courses?ref=geekmonkey.org">Julia Academy</a> provides a number of interesting courses.</li><li>Finally, the official <a href="https://docs.julialang.org/en/v1/?ref=geekmonkey.org">Julia documentation</a> will always be my first go-to resource should I get stuck on a problem. 
As a shorthand for browsing the documentation I&apos;ll resort to using Julia&apos;s built-in <a href="https://geekmonkey.org/getting-started-with-julia-lang/#repl">help mode in the REPL</a>.</li></ul><h3 id="intermediateadvanced">Intermediate/Advanced</h3><p>Eventually I&apos;d like to dive a lot deeper into Julia performance and learn how to squeeze the most out of my code. </p><ul><li>The MIT course <a href="https://computationalthinking.mit.edu/Spring21/?ref=geekmonkey.org">Introduction to Computational Thinking</a> is an incredibly well-received course that uses Julia and Pluto.jl. The course teaches image analysis, machine learning, climate modelling and dynamics on networks.</li><li>The only book that stood out to me is <a href="https://www.packtpub.com/product/julia-high-performance-second-edition/9781788298117?ref=geekmonkey.org">Julia High Performance - Second Edition</a> by Avik Sengupta and Alan Edelman. Just judging from its cover, it seems to go into a lot more depth than most books about Julia out there.</li></ul><h2 id="tools">Tools</h2><p>When it comes to tools, there really isn&apos;t much needed for Julia other than a decent computer that runs any of the three main OSes. Most of my development these days happens on Mac OS, but I&apos;m also exploring developing on a more powerful <a href="https://geekmonkey.org/developer-pc-build-2021/">Windows machine</a> with WSL2.</p><p>For now, I&apos;m going to be using:</p><ul><li>Visual Studio Code with the <a href="https://www.julia-vscode.org/?ref=geekmonkey.org">Julia for VSCode</a> extension. 
It&apos;s definitely not a requirement to use this particular editor or even the extension, but it&apos;s an editor I&apos;m already very comfortable with.</li><li><a href="https://github.com/fonsp/Pluto.jl?ref=geekmonkey.org">Pluto.jl</a> for interactive notebooks and quick data exploration</li></ul><p>That&apos;s really all that&apos;s needed to get started. I&apos;m sure my development workflow and tools will evolve over time, but for now this setup will do.</p><p>If you are interested in seeing a different setup using neovim and tmux, I suggest you check out Jacob Zelko&apos;s excellent post on his <a href="http://jacobzelko.com/workflow/?ref=geekmonkey.org">writing and coding workflow</a>.</p><h2 id="project-driven-learning">Project-driven Learning</h2><p>My method of learning a new (programming) language has evolved over time. Immediately applying what you learn is essential for retention. As I read through my resources and explore Julia packages I will develop small project ideas. These projects aren&apos;t necessarily meant to evolve into production-grade software, but rather to produce something interesting or help me explore an area of technology.</p><p>Not all of my time will be spent on larger projects, however. 
There&apos;s value in exploring small coding challenges from exercism.io, <a href="https://projecteuler.net/?ref=geekmonkey.org">projecteuler</a> and others when you&apos;re short on time or can&apos;t make any progress in your project.</p><p>As I dive deeper into Julia and get more comfortable navigating the package ecosystem I expect the focus of my projects to also shift and my projects to grow in size.</p><p>I intend to document most of these projects and invite you to follow along as I learn this exciting language and discover its secrets.</p><div class="kg-card kg-callout-card kg-callout-card-blue"><div class="kg-callout-emoji">&#x1F4A1;</div><div class="kg-callout-text">If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning, and other tech.<br><br>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</div></div><p></p><p></p>]]></content:encoded></item><item><title><![CDATA[Learn With Me: Julia - Introduction (#1)]]></title><description><![CDATA[<p>Welcome to <em>Learn With Me: Julia</em>. A series where you can follow me along my journey of learning Julia, Data Science and Machine Learning. 
This series is heavily inspired by <a href="https://inquisitivedeveloper.com/lwm-elixir-1/?ref=geekmonkey.org">Learn With Me: Elixir</a>, a series by Kevin Peter / The Inquisitive Developer and the format of this post will follow</p>]]></description><link>https://geekmonkey.org/lwm-julia-1/</link><guid isPermaLink="false">609cf08681a039004b6c5768</guid><category><![CDATA[Learn With Me: Julia]]></category><category><![CDATA[Julia]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Thu, 13 May 2021 15:00:00 GMT</pubDate><media:content url="https://geekmonkey.org/content/images/2021/05/lwm-julia-logo.png" medium="image"/><content:encoded><![CDATA[<img src="https://geekmonkey.org/content/images/2021/05/lwm-julia-logo.png" alt="Learn With Me: Julia - Introduction (#1)"><p>Welcome to <em>Learn With Me: Julia</em>, a series in which you can follow along on my journey of learning Julia, Data Science and Machine Learning. This series is heavily inspired by <a href="https://inquisitivedeveloper.com/lwm-elixir-1/?ref=geekmonkey.org">Learn With Me: Elixir</a>, a series by Kevin Peter / The Inquisitive Developer, and the format of this post will follow his introductory post for Elixir. </p><p>I realised that while there are plenty of resources about Julia already out there, it would be interesting to document my journey in picking up the language and some fundamental data science and machine learning with it.</p><p>The Julia community, to a large degree, consists of academics. The level of discourse on the Julia Slack / Zulip is often too advanced for me to understand. Researchers from all kinds of fields, from space engineering and bio engineering to mathematics, come together to practice Julia. 
The 2020 Community Survey nicely shows this:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/05/image-4.png" class="kg-image" alt="Learn With Me: Julia - Introduction (#1)" loading="lazy" width="1290" height="622" srcset="https://geekmonkey.org/content/images/size/w600/2021/05/image-4.png 600w, https://geekmonkey.org/content/images/size/w1000/2021/05/image-4.png 1000w, https://geekmonkey.org/content/images/2021/05/image-4.png 1290w" sizes="(min-width: 720px) 720px"><figcaption>Slide from the 2020 <a href="https://julialang.org/assets/2020-julia-user-developer-survey.pdf?ref=geekmonkey.org">Julia Community Survey</a></figcaption></figure><h3 id="about-you">About You</h3><p>I&apos;m going to be writing this series for someone who has some programming knowledge and like me wants to learn about Julia. Some familiarity with computer science and algorithms will be useful and you should also have an interest in mathematics as I like to lean into the mathematics of machine learning when I&apos;m ready to explore it with Julia.</p><p>For the rest of this section I&apos;ll quote Kevin Peter, since it also applies here:</p><blockquote>So while this series is not meant for beginning programmers, you don&apos;t have to be a master programmer to follow along either. I very much doubt I will be delving into any advanced theoretical concepts or heavy mathematics. I&apos;m aiming for practical stuff that a typical experienced software developer will be able to read and understand. 
I aim to be easily readable and informative.</blockquote><p>If you need a resource on how to get started with Julia and a quick overview of why I chose this language you can read my <a href="https://geekmonkey.org/getting-started-with-julia-lang/">Getting started with Julia</a> post.</p><h3 id="about-me">About Me</h3><p>I left academia six years ago and have since been working with Ruby, JavaScript and Python in a professional setting almost exclusively. In my job I build web application backends and ETL pipelines and work closely with data scientists. </p><p>My personal interest in Machine Learning is what&apos;s driving me to Julia. Its expressivity over other languages like Python intrigues me and makes me think that it&apos;s only going to grow going forward.</p><p>Recently I&apos;ve <a href="https://geekmonkey.org/100daysofcode-julia-edition/">challenged myself</a> to practice Julia 45 minutes every (week)day as part of a #100daysofcode challenge. I&apos;m 25 days into this challenge and have explored the popular libraries such as Pluto, Plots, Revise and Javis. 
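</p><p>To give a flavour of the expressivity I mentioned above, here is a tiny stdlib-only sketch of my own (a hypothetical <code>area</code> function, not from any package) showing multiple dispatch, where one generic function gains a method per type signature:</p>

```julia
# Multiple dispatch: the same function name, specialised per argument types.
area(r::Real) = π * r^2          # one argument: circle of radius r
area(w::Real, h::Real) = w * h   # two arguments: rectangle

println(area(3, 4))   # 12
```

<p>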
</p><p>You can find more information about who I am on the <a href="https://geekmonkey.org/about/">About</a> page</p>]]></content:encoded></item><item><title><![CDATA[Julia GPU Programming with WSL2]]></title><description><![CDATA[A walkthrough on how to get started with Julia GPU programming under WSL2 on Windows.]]></description><link>https://geekmonkey.org/julia-gpu-programming-with-wsl2/</link><guid isPermaLink="false">60844ee5e9bb85003b285c78</guid><category><![CDATA[Julia]]></category><category><![CDATA[CUDA]]></category><category><![CDATA[GPU Programming]]></category><category><![CDATA[WSL2]]></category><category><![CDATA[Ubuntu]]></category><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Wed, 05 May 2021 13:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1604361060556-5de6bf0b4163?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fGdwdXxlbnwwfHx8fDE2MjAyMTQ2MDY&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1604361060556-5de6bf0b4163?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=MnwxMTc3M3wwfDF8c2VhcmNofDV8fGdwdXxlbnwwfHx8fDE2MjAyMTQ2MDY&amp;ixlib=rb-1.2.1&amp;q=80&amp;w=2000" alt="Julia GPU Programming with WSL2"><p>I use Windows for gaming. It&apos;s been a long time since I&apos;ve last done any serious development work on my Windows machine and yet I still spent a good chunk of money on building out a <a href="https://geekmonkey.org/developer-pc-build-2021/">beefy machine</a> for my efforts to learn machine learning. 
It has taken me a few months to finally sit down and get this machine ready for anything other than gaming.</p><p>With the <a href="https://ubuntu.com/blog/new-gpu-and-gui-features-announced-for-wsl-at-build?ref=geekmonkey.org">recent announcement</a> of GUI support for WSL2 via WSLg I got really excited to try out WSL and see how good the GPU support actually is, but that&apos;s not the main reason I&apos;m interested. I&apos;ve been shying away from developing on Windows because I&apos;m used to a *NIX environment. WSL gives you that, but up until recently you wouldn&apos;t have been able to interact with any GPU - and this all changed with this announcement!</p><p>You can watch the video below to see what&apos;s coming for WSL2.</p><figure class="kg-card kg-embed-card"><iframe width="200" height="113" src="https://www.youtube.com/embed/f8_nvJzuaSU?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><p>The first half of this article will show you how to get everything set up and in the second half we&apos;ll set up <a href="https://juliagpu.gitlab.io/CUDA.jl/?ref=geekmonkey.org">CUDA.jl</a> in Julia. Since the latter part is about CUDA I&apos;ll assume that you have a compatible nVidia GPU.</p><h2 id="installation">Installation</h2><p>Here&apos;s a summary of what we need to go through to prepare our environment:</p><ul><li>Update Windows 10 to the latest release on the dev channel</li><li>Install nVidia CUDA drivers</li><li>Install Ubuntu 20.04 in WSL2</li><li>Install Linux CUDA packages</li><li>&#x1F389;</li></ul><h3 id="windows-10-insider-preview">Windows 10 Insider Preview</h3><p>At the time of writing all of the features are only available through the <a href="https://insider.windows.com/en-us/?ref=geekmonkey.org">Windows Insider Program</a>. The Windows Insider Program allows you to receive new Windows features before they hit the main update line. 
The program is split into three channels: <strong>Dev</strong>, <strong>Beta</strong> and <strong>Release Preview</strong>.</p><p>To receive the update with WSLg and GPU support we will need to switch to the dev channel.</p><figure class="kg-card kg-image-card"><img src="https://geekmonkey.org/content/images/2021/05/image-1.png" class="kg-image" alt="Julia GPU Programming with WSL2" loading="lazy" width="708" height="579" srcset="https://geekmonkey.org/content/images/size/w600/2021/05/image-1.png 600w, https://geekmonkey.org/content/images/2021/05/image-1.png 708w"></figure><p><strong>Note:</strong> The dev channel comes with some rough edges and potential for system instability. Be mindful of this when you switch and make sure you have backups!</p><p>After installing all downloaded updates you should end up with <a href="https://blogs.windows.com/windows-insider/2021/04/21/announcing-windows-10-insider-preview-build-21364/?ref=geekmonkey.org">OS Build 21364</a> or higher. You can check your OS Build by running <code>winver</code> in PowerShell/cmd.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-1.png" class="kg-image" alt="Julia GPU Programming with WSL2" loading="lazy" width="460" height="423"><figcaption>Run <code>winver</code> using the Windows Run command</figcaption></figure><p>With this all set we can hop on to install the latest WSL2 compatible CUDA drivers.</p><h3 id="cuda-drivers">CUDA drivers</h3><p>NVIDIA are providing special CUDA drivers for Windows 10 WSL. The link below will take you to the download page. 
It&apos;s required to sign up for the NVIDIA Developer Program, which is free.</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://developer.nvidia.com/cuda/wsl?ref=geekmonkey.org"><div class="kg-bookmark-content"><div class="kg-bookmark-title">GPU in Windows Subsystem for Linux (WSL)</div><div class="kg-bookmark-description">CUDA on Windows Subsystem for Linux (WSL) - Public Preview Microsoft Windows is a ubiquitous platform for enterprise, business, and personal computing systems. However, industry AI tools, models, frameworks, and libraries are predominantly available on Linux OS. Now all users of AI - whether they ar&#x2026;</div><div class="kg-bookmark-metadata"><img class="kg-bookmark-icon" src="https://developer.nvidia.com/sites/all/themes/devzone_base/favicon.ico" alt="Julia GPU Programming with WSL2"><span class="kg-bookmark-author">NVIDIA Developer</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://developer.nvidia.com/sites/default/files/akamai/cuda/images/WSL-launch-stack-diagram-HR-r4.png" alt="Julia GPU Programming with WSL2"></div></a></figure><p>Follow the setup wizard (I chose the express installation, which keeps existing settings in place).</p><p><strong>Note:</strong> There&apos;s also a documentation page provided by NVIDIA around setting up your GPU for WSL. I found the docs to be outdated and not working on my machine.</p><h3 id="installing-ubuntu-2004-lts-with-wsl2">Installing Ubuntu 20.04 LTS with WSL2</h3><p>Before we can proceed with installing Ubuntu I advise updating the WSL kernel by running:</p><pre><code class="language-sh">wsl --update</code></pre><p>In case you&apos;re like me and don&apos;t enjoy the default terminal Windows comes with I suggest you install <a href="https://www.microsoft.com/en-us/p/windows-terminal/9n0dx20hk701?activetab=pivot%3Aoverviewtab&amp;ref=geekmonkey.org">Windows Terminal</a> from the Microsoft Store. 
This terminal is a lot more pleasant to use than either cmd or the PowerShell terminal ever were.</p><p>It&apos;s also a good idea to set WSL to default to version 2:</p><pre><code class="language-sh">wsl --set-default-version 2</code></pre><p>Finally, let&apos;s install Ubuntu with:</p><pre><code class="language-sh">wsl --install --distribution Ubuntu-20.04</code></pre><p>In case you were wondering what other distributions are available you can simply run: <code>wsl --list --online</code>.</p><h3 id="ubuntu-2004-lts">Ubuntu 20.04 LTS</h3><p>With Ubuntu installed we have a couple of final steps. First we will add an upstream repo to apt for getting the latest CUDA builds directly from NVIDIA:</p><figure class="kg-card kg-code-card"><pre><code class="language-sh">sudo add-apt-repository &quot;deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/ /&quot;</code></pre><figcaption>Add CUDA apt repository for Ubuntu 20.04</figcaption></figure><p>We also need to add NVIDIA&apos;s GPG key for the apt repo:</p><figure class="kg-card kg-code-card"><pre><code class="language-sh">sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub </code></pre><figcaption>Add nVidia gpg key</figcaption></figure><p>And finally to make sure that we prefer the packages provided by NVIDIA over packages in mainline Ubuntu we need to pin the apt repo:</p><figure class="kg-card kg-code-card"><pre><code class="language-sh">wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/cuda-ubuntu2004.pin
sudo mv cuda-ubuntu2004.pin /etc/apt/preferences.d/cuda-repository-pin-600</code></pre><figcaption>Pin file to prioritize CUDA apt repository</figcaption></figure><p>With this out of the way, we&apos;re ready to install the CUDA drivers inside our WSL Ubuntu installation:</p><pre><code class="language-sh">sudo apt update &amp;&amp; sudo apt install -y cuda-drivers</code></pre><p>In case setting up WSL2 with GPU support was all you wanted to do - we&apos;re done! </p><h2 id="julia-wsl2-cudajl">Julia + WSL2 + CUDA.jl</h2><p>Ok, so as promised in the headline we&apos;re now going to install Julia inside our Ubuntu 20.04 and set up CUDA.jl.</p><h3 id="install-julia">Install Julia</h3><p>At the time of writing Julia 1.6.1 is the latest version available (make sure to check for updates on Julia&apos;s download page).</p><p>First, let&apos;s fetch the latest Julia tarball:</p><pre><code class="language-sh">wget https://julialang-s3.julialang.org/bin/linux/x64/1.6/julia-1.6.1-linux-x86_64.tar.gz</code></pre><p>Extract the <code>.tar.gz</code>:</p><pre><code class="language-sh">tar -xvzf julia-1.6.1-linux-x86_64.tar.gz
</code></pre><p>Move the extracted folder to <a href="https://askubuntu.com/a/34922/292615?ref=geekmonkey.org"><code>/opt</code></a>:</p><pre><code class="language-sh">sudo mv julia-1.6.1 /opt/
</code></pre><p>Finally, create a symbolic link to <code>julia</code> inside the <code>/usr/local/bin</code> folder:</p><pre><code class="language-sh">sudo ln -s /opt/julia-1.6.1/bin/julia /usr/local/bin/julia</code></pre><p>You may pick a different target directory for your installation of Julia or use a version manager like asdf-vm.</p><h3 id="install-cudajl">Install CUDA.jl</h3><p>At this point simply run <code>julia</code> in your terminal and you should be dropped into the Julia REPL. I assume you&apos;ve worked with Julia before and know how to operate its package manager.</p><p>Hit <code>]</code> to enter pkg mode and install CUDA with:</p><pre><code>activate --temp
add CUDA</code></pre><p>From here hit backspace and import CUDA. CUDA.jl provides a useful function called <code>functional</code> which will confirm that we&apos;ve done everything right (well, that&apos;s the hope at least, right?).</p><pre><code class="language-julia">using CUDA
CUDA.functional()</code></pre><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/05/image-3.png" class="kg-image" alt="Julia GPU Programming with WSL2" loading="lazy" width="871" height="510" srcset="https://geekmonkey.org/content/images/size/w600/2021/05/image-3.png 600w, https://geekmonkey.org/content/images/2021/05/image-3.png 871w" sizes="(min-width: 720px) 720px"><figcaption>Ensuring Julia CUDA.jl is functional inside WSL2</figcaption></figure><p>You can additionally run <code>CUDA.versioninfo()</code> to get a more detailed breakdown of the supported features on your GPU.</p><p>At this point you should have a working installation with WSL2, Ubuntu 20.04, Julia and CUDA.jl. In case you&apos;re new to CUDA.jl I suggest you follow the excellent <a href="https://juliagpu.gitlab.io/CUDA.jl/tutorials/introduction/?ref=geekmonkey.org">introduction to GPU programming</a> by JuliaGPU or jump in at the deep end with <a href="https://fluxml.ai/Flux.jl/stable/gpu/?ref=geekmonkey.org">FluxML&apos;s GPU support</a>.</p><hr><p>If you like articles like this one, please consider subscribing to my free newsletter where at least once a week I send out my latest work covering Julia, Python, Machine Learning and other tech.</p><p>You can also follow me on <a href="https://twitter.com/geekproject?ref=geekmonkey.org">Twitter</a>.</p>]]></content:encoded></item><item><title><![CDATA[Developer PC Build 2021]]></title><description><![CDATA[What components are required to build a good developer machine in 2021? 
In this article I summarize how I built my most recent machine and how I selected its components.]]></description><link>https://geekmonkey.org/developer-pc-build-2021/</link><guid isPermaLink="false">608901c9e9bb85003b285ca9</guid><dc:creator><![CDATA[Fabian Becker]]></dc:creator><pubDate>Wed, 28 Apr 2021 17:19:20 GMT</pubDate><media:content url="https://geekmonkey.org/content/images/2021/04/pc-2020-full-1.jpeg" medium="image"/><content:encoded><![CDATA[<img src="https://geekmonkey.org/content/images/2021/04/pc-2020-full-1.jpeg" alt="Developer PC Build 2021"><p>Some time in 2020 I came to the realisation that my 2013-built gaming computer was getting a bit outdated. Between becoming a father and working a full-time job I didn&apos;t have as much time to play games as I used to, so for the past few years I barely had any use for a proper gaming PC and ended up only casually playing old games like Skyrim or Starcraft 2, which my computer was more than capable of handling.</p><p>With the pandemic came more time at home and at the same time a renewed interest in learning some machine learning and utilizing GPUs - it was time to build a new computer!</p><p>I set out to build a computer that is powerful enough to run new and upcoming games like Cyberpunk 2077 but also serve as a tool for machine learning experiments. 
It doesn&apos;t need beat all the benchmarks but it should provide me with enough computing power to last me several years.</p><h3 id="my-new-machine">My new machine</h3><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://geekmonkey.org/content/images/2021/04/pc-2020-full.jpeg" width="1896" height="1896" loading="lazy" alt="Developer PC Build 2021" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/pc-2020-full.jpeg 600w, https://geekmonkey.org/content/images/size/w1000/2021/04/pc-2020-full.jpeg 1000w, https://geekmonkey.org/content/images/size/w1600/2021/04/pc-2020-full.jpeg 1600w, https://geekmonkey.org/content/images/2021/04/pc-2020-full.jpeg 1896w" sizes="(min-width: 720px) 720px"></div><div class="kg-gallery-image"><img src="https://geekmonkey.org/content/images/2021/04/pc-2020-closeup.jpeg" width="1782" height="1782" loading="lazy" alt="Developer PC Build 2021" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/pc-2020-closeup.jpeg 600w, https://geekmonkey.org/content/images/size/w1000/2021/04/pc-2020-closeup.jpeg 1000w, https://geekmonkey.org/content/images/size/w1600/2021/04/pc-2020-closeup.jpeg 1600w, https://geekmonkey.org/content/images/2021/04/pc-2020-closeup.jpeg 1782w" sizes="(min-width: 720px) 720px"></div></div></div></figure><ul><li><strong>CPU:</strong> <a href="https://amzn.to/3e00T8Q?ref=geekmonkey.org">AMD Ryzen 7 3700X 8x 3.60GHz So.AM4 BOX</a> 269.00&#x20AC;</li><li><strong>RAM:</strong> <a href="https://amzn.to/3gGVe9G?ref=geekmonkey.org">32GB (2x 16384MB) G.Skill Aegis DDR4-3200 Dual Kit</a> 96.90&#x20AC;</li><li><strong>Motherboard:</strong> <a href="https://amzn.to/3t1Sypw?ref=geekmonkey.org">ASRock X470 Master SLI AMD X470 So.AM4 Dual Channel DDR4 ATX</a> 121.62&#x20AC;</li><li><strong>SSD:</strong> <a 
href="https://www.mindfactory.de/product_info.php/256GB-Transcend-110S-M-2-2280-PCIe-3-0-x4-3D-NAND-TLC--TS256GMTE110S-_1252806.html?ref=geekmonkey.org">256GB Transcend 110S M.2 2280 PCIe 3.0 x4 3D-NAND TLC 256</a> 36.40&#x20AC;</li><li><strong>SSD:</strong> <a href="https://www.mindfactory.de/product_info.php/1000GB-PNY-XLR8-CS3030-M-2-2280-PCIe-3-0-x4-NVMe-1-3-3D-NAND-TLC--M280C_1302614.html?ref=geekmonkey.org">1000GB PNY XLR8 CS3030 M.2 2280 PCIe 3.0 x4 NVMe 1.3 3D-NAND TLC</a> 134.70&#x20AC;</li><li><strong>Power:</strong> <a href="https://amzn.to/3aHpNIj?ref=geekmonkey.org">be quiet! Power Zone 750W</a> 103.61&#x20AC;</li><li><strong>Case:</strong> <a href="https://amzn.to/3gIK3xh?ref=geekmonkey.org">be quiet! Pure Base 500</a> 69.03&#x20AC;</li></ul><p>This makes a total of 1884.16 Euros.</p><h3 id="case">Case</h3><p>The case was selected purely based on cost, but I ended up with a good-looking case that even has a transparent side panel. The be quiet! Pure Base 500 is a pretty standard ATX case with 2 USB ports and headphone and microphone jacks on the front panel. It&apos;s optimized for a quiet environment and hence perfect for my needs.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-3.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="1280" height="1280" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/image-3.png 600w, https://geekmonkey.org/content/images/size/w1000/2021/04/image-3.png 1000w, https://geekmonkey.org/content/images/2021/04/image-3.png 1280w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://amzn.to/3gIK3xh?ref=geekmonkey.org">be quiet! Pure Base 500</a></figcaption></figure><h3 id="power-supply">Power Supply</h3><p>As a power supply I bought the be quiet! Power Zone 750W. 
It&apos;s a midrange power supply and probably a bit overkill for my build, but I wanted to have the option to upgrade to a dual graphics card setup down the road. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-2.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="535" height="535"><figcaption><a href="https://amzn.to/3aHpNIj?ref=geekmonkey.org">be quiet! Power Zone 750W</a></figcaption></figure><p>As with the case, the be quiet! power supply comes with a really quiet fan that is barely noticeable even under massive load. The cable management with this power supply is super easy.</p><h3 id="cpu">CPU</h3><p>Strong, stable and ready to support all my needs. I realised I knew too little about the current CPU market, and after deliberating with a friend I went with an AMD that just seemed right: 8 cores at 3.6GHz, boosting up to 4.4GHz, on an AM4 socket. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-5.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="992" height="558" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/image-5.png 600w, https://geekmonkey.org/content/images/2021/04/image-5.png 992w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://amzn.to/3e00T8Q?ref=geekmonkey.org">AMD Ryzen 7 3700X 8x 3.60GHz So.AM4 BOX</a></figcaption></figure><h3 id="motherboard">Motherboard</h3><p>I picked my motherboard after picking my CPU, which needs an AM4 socket. The ASRock X470 Master SLI was the cheapest board available with support for AMD Matisse generation CPUs. 
</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-8.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="600" height="500" srcset="https://geekmonkey.org/content/images/2021/04/image-8.png 600w"><figcaption><a href="https://amzn.to/3t1Sypw?ref=geekmonkey.org">ASRock X470 Master SLI AMD X470 So.AM4 Dual Channel DDR4 ATX</a></figcaption></figure><p>The only downside to this motherboard is that it was delivered with an old BIOS that didn&apos;t yet support Matisse. I had to buy a cheap AMD Athlon 200GE 2x 3.20GHz to perform a BIOS upgrade before finally installing my CPU.</p><h3 id="ram">RAM</h3><p>RAM is one of those things where more is always better. Experiencing the slowdown when applications start swapping memory to disk is excruciating, and a small upfront investment can save me from it.</p><p>My motherboard supports up to 64 GB of RAM - for now I went with 32 GB to see how long it would last me. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/gskill-aegis.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="1080" height="293" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/gskill-aegis.png 600w, https://geekmonkey.org/content/images/size/w1000/2021/04/gskill-aegis.png 1000w, https://geekmonkey.org/content/images/2021/04/gskill-aegis.png 1080w" sizes="(min-width: 720px) 720px"><figcaption><a href="https://amzn.to/3gGVe9G?ref=geekmonkey.org">32GB (2x 16384MB) G.Skill Aegis DDR4-3200 Dual Kit</a></figcaption></figure><p>The 2x16GB G.Skill Aegis DDR4-3200 have been more than sufficient and my motherboard automatically detected the memory frequency and adjusted its settings. I&apos;m considering upgrading to 64 GB purely because 
&quot;I can&quot;, not because there&apos;s any imminent need for me to upgrade.</p><h3 id="storage">Storage</h3><p>My storage setup consists of <strong>2 NVMe PCIe M.2 SSDs</strong>. That&apos;s quite the mouthful, but essentially, instead of using old-school HDDs with spinning discs or SATA SSDs, these SSDs simply sit on your mainboard, allowing for much faster read/write speeds than those traditional options.</p><p>The SSDs I went with are:</p><ul><li>256GB Transcend 110S M.2 2280 PCIe 3.0 x4 3D-NAND TLC (TS256GMTE110S)</li><li>1000GB PNY XLR8 CS3030 M.2 2280 PCIe 3.0 x4 NVMe 1.3 3D-NAND TLC (M280CS3030-1TB-RB)</li></ul><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-9.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="320" height="320"><figcaption>1000GB PNY XLR8 CS3030 M.2 2280 PCIe 3.0 x4 NVMe 1.3 3D-NAND TLC</figcaption></figure><p>I&apos;m using the 256GB model for my operating system and the larger 1TB model for most of my file storage. If you look closely enough, there&apos;s also an old-school SATA SSD.</p><h3 id="graphics-card">Graphics Card</h3><p>Writing this in 2021, when graphics cards are harder to find than gold, I consider myself lucky to have gotten my 11GB GeForce RTX 2080 Ti last year.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-7.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="600" height="600" srcset="https://geekmonkey.org/content/images/2021/04/image-7.png 600w"><figcaption>11GB KFA2 GeForce RTX 2080 Ti EX (1-Click OC) Aktiv PCIe 3.0 x16</figcaption></figure><p>While the 2080 Ti is now superseded by nVidia&apos;s new 30X0 generation, I think this card will last me for years to come. 
It&apos;s powerful enough to play games like Cyberpunk 2077 on max settings, and it sure as hell can run a lot of machine learning with its 4352 CUDA cores.</p><h3 id="benchmark">Benchmark</h3><p>Benchmarks are silly. That said, I ran the PassMark performance test to see how my computer would rank against &quot;the world&quot;. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://geekmonkey.org/content/images/2021/04/image-4.png" class="kg-image" alt="Developer PC Build 2021" loading="lazy" width="816" height="683" srcset="https://geekmonkey.org/content/images/size/w600/2021/04/image-4.png 600w, https://geekmonkey.org/content/images/2021/04/image-4.png 816w" sizes="(min-width: 720px) 720px"><figcaption>Passmark Rating</figcaption></figure><p>Personally, I&apos;m more than happy with the result. I wasn&apos;t trying to build the ultra ultimate extreme power machine, but instead a computer that would satisfy my needs, and after using this computer for a good 6 months I can say that I succeeded. </p>]]></content:encoded></item></channel></rss>