Hello and welcome to my little nook of the internet. Here I have my blog, which mainly contains posts about tech, the outdoors, cooking, and sometimes mead brewing.

A bit of background information before you step into my world of crazy. I am Lars, a.k.a. looopTools, a Software Engineer living in East Jutland, Denmark. I have 10+ years of experience from industry, mainly through student positions, but also as a self-employed consultant and as a full-time employee. I mainly work in low-level user space, high-level kernel space, and storage systems in general. Besides research and software development, I also love the outdoors and try to go out as often as possible (not enough at the moment), and I am an aspiring author currently working on a few different novels. I also dabble in being a more advanced home cook and baker, which you may see some posts about. Finally, I like the ancient art of brewing mead, a.k.a. honey wine, and experiment with different flavour combinations and ageing times.

For element in range in C++

12 June 2024

While I was a teaching assistant, and during my time helping new employees move from Python to C++, I have often gotten one question: why can I not write a for loop like for elm in range(0, 10)? And you know what, that is actually a fair question, in particular for arithmetic types… but what if you could get part of the way there?

Well, first, let us look at a concept to define a type that must be an arithmetic type. This can be done fairly easily, like this: template<typename NumericType> concept Numeric = std::is_arithmetic<NumericType>::value;. Basically, this creates a constraint requiring a type to be arithmetic. I will show a little later what happens if you use one that is not.
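
As a minimal, self-contained sketch (the static_assert checks below are just mine for illustration, not part of the final program), you can verify at compile time which types satisfy the concept:

#include <type_traits>

template<typename NumericType> concept Numeric = std::is_arithmetic<NumericType>::value;

// Compile-time checks: arithmetic types satisfy Numeric, a C string does not.
static_assert(Numeric<int>);
static_assert(Numeric<double>);
static_assert(!Numeric<const char*>);

int main(void) {}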

Next, let us generate the range. Here I will use std::vector as my range container. This can be done very simply with the function below. It assumes that start is smaller than end; otherwise it will not work.

template <Numeric T> constexpr std::vector<T> range(T start, T end)
{
    // Assumes start < end; the range holds end - start elements, from start up to (but not including) end.
    T size = end - start;
    std::vector<T> data(size);
    for (T i = 0; i < size; ++i)
    {
        data.at(i) = start + i;
    }
    return data;
}
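
As an aside: if you only need integer ranges and have a compiler with C++20 ranges support, std::views::iota from the standard library gives you a lazy equivalent that does not allocate a container. This is not the approach used in the rest of this post, just an alternative worth knowing:

#include <iostream>
#include <ranges>

int main(void)
{
    // std::views::iota(1, 10) lazily yields 1, 2, ..., 9 without building a std::vector.
    for (const auto val : std::views::iota(1, 10))
    {
        std::cout << " " << val;
    }
    std::cout << "\n";
}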

Now let us test it. For this we will make a print_range function which takes start and end, calls range, and prints the resulting “range”. Again we use the Numeric template type, constrained to be an arithmetic type.

template <Numeric T> void print_range(T start, T end)
{
    std::cout << "[";
    for (const auto val : range(start, end))
    {
        std::cout << " " << std::to_string(val);
    }
    std::cout << " ]\n";
}

Finally, our main function, where we test with size_t, int, and double.

int main(void)
{
    print_range(static_cast<size_t>(1), static_cast<size_t>(10));
    print_range(1, 10);
    print_range(1.0, 10.0);
}

The expected result is:

[ 1 2 3 4 5 6 7 8 9 ]
[ 1 2 3 4 5 6 7 8 9 ]
[ 1.000000 2.000000 3.000000 4.000000 5.000000 6.000000 7.000000 8.000000 9.000000 ]

The program should be compiled with clang++ -std=c++20 or g++ -std=c++20. The full program is listed below:

#include <vector>
#include <concepts>
#include <iostream>
#include <string>
#include <type_traits>

template<typename NumericType> concept Numeric = std::is_arithmetic<NumericType>::value;

template <Numeric T> constexpr std::vector<T> range(T start, T end)
{
    T size = end - start;
    std::vector<T> data(size);
    for (T i = 0; i < size; ++i)
    {
        data.at(i) = start + i;
    }
    return data;
}

template <Numeric T> void print_range(T start, T end)
{
    std::cout << "[";
    for (const auto val : range(start, end))
    {
        std::cout << " " << std::to_string(val);
    }
    std::cout << " ]\n";
}

int main(void)
{
    print_range(static_cast<size_t>(1), static_cast<size_t>(10));
    print_range(1, 10);
    print_range(1.0, 10.0);
}

Now let us say we had called print_range with "a" as start and "z" as end. Then you would get the following compilation error:

main.cpp:34:5: error: no matching function for call to 'print_range'
   34 |     print_range("a", "z");
      |     ^~~~~~~~~~~
main.cpp:19:27: note: candidate template ignored: constraints not satisfied [with T = const char *]
   19 | template <Numeric T> void print_range(T start, T end)
      |                           ^
main.cpp:19:11: note: because 'const char *' does not satisfy 'Numeric'
   19 | template <Numeric T> void print_range(T start, T end)
      |           ^
main.cpp:6:50: note: because 'std::is_arithmetic<const char *>::value' evaluated to false
    6 | template<typename NumericType> concept Numeric = std::is_arithmetic<NumericType>::value;
      |                                                  ^
1 error generated.

Some final remarks. First, this can be made MUCH prettier (which I may show in a later post). Secondly, this is not very optimised and only serves as inspiration. Finally, play around with it.

./Lars

Screens in the classroom: my hot take

05 May 2024

Before we get started: this post is mainly related to a discussion that has been going on in Denmark over the last two weeks about kids' and young adults' usage of screens in the classroom. Therefore, most of the content I link to will be in Danish, sorry about that.

There has lately been discussion about removing screens completely from the classroom, with, in my opinion, very little nuance in the discussion. So I would love to give my hot take on this. And what are my qualifications? None, really, besides having been in school before phones and computers were common, and having been a teaching assistant (TA) at university.

In my opinion there are both pros and cons to allowing devices such as laptops, tablets, and phones in the classroom. But for this discussion I will have to divide them into two separate categories: phones and then the rest, or more specifically, personal phones and then the rest.

Personal phones have absolutely no place in the classroom, unless a parent needs to get hold of a kid/teenager in an emergency. But in that case a parent can call the school, which will get hold of the student; that is possible in Denmark at least. Now why am I so against personal phones? Well, I have literally never seen a phone being used for anything relevant to the class. It has either been for gaming or social media, never anything relevant. This is a distraction not only to the student using the phone but also to the students sitting around them. Even though it does not seem like it, this disrupts the class and students will miss information. Furthermore, it is very annoying for the teacher/lecturer, as it is very obvious that the student or students are not following along. First off, it is a lack of respect to so clearly show no interest in a class. Secondly, it makes the teacher think: why the hell did I even prepare for this lecture, even if it is only one student. Thirdly, as a teacher you also know that this student will likely have very simple questions later, just because they did not follow along. You do not believe me? Try being a TA for one lecture. Finally, a thing I have heard from students and parents: “It cannot be that obvious that they are using a phone”! I sadly do not remember where I have this quote from, but: “No one is that interested in their crotch”. Therefore, I believe that phones should not be present in a classroom. When it comes to breaks, I would prefer strong, in-the-moment social relations between the students, ones not built solely on technology, but I am not sure that is feasible.

On the other hand, when it comes to tablets and computers, they do have significant uses in the classroom, be it for taking notes, either directly in slide PDFs or in whatever fancy note program the student uses. In many cases pen and paper could easily replace it, and returning to slide printouts is a good idea. Personally, I retain things better when taking notes by pencil than when typing on a computer, but I gather this is not the case for everyone. Additionally, when writing reports or making plots, slides, and so on, a computer or tablet is useful. However, like a phone, they can take focus away from the class, for example when students are gaming or watching videos. Most of the time it is silent, but sometimes there is constant clicking, or a student saying a little too loudly “nice move” or something like that, and then it becomes a problem, not just for the students but also for the teacher. So computers/tablets have their pros and their cons.

But how do we solve the cons? I have two suggestions, and one is rather radical, so we will start with that: government- or school-issued devices that have a limited set of apps installed, with blocks on installing more, and network blocks on social media and online gaming sites. The devices should have, at a bare minimum, a word processor, a spreadsheet, and presentation software. As the students mature, more can be added. For instance, if a student is following a STEM line, biology, physics, and programming tools can be added; if instead an economics line is followed, finance applications can be added. This would to a certain extent alleviate a lot of the problem. However, I am very much against internet censorship, so it even hurt to suggest this.

The less dramatic option is to confiscate the offending student's device for the rest of the class. A repeat offender can, in the worst case, be forbidden from bringing devices to class for a while. Although this will limit the student to taking notes by hand, it is also a way to show that there are consequences for one's actions, and hopefully it would help the student grow.

Finally, it is not just teachers and schools that have to do something here. Parents also have to address their kids' usage of technology and dependency on it. A human should be able to survive 45 minutes without looking at their phone or checking social media.

During my PhD I attended a talk about how the maths department had adopted a way of producing teaching material in chunks of no more than fifteen minutes, so it fit the modern attention span. I raised the question of whether it was not wrong to adjust to this, and whether we should rather expect more of our students. The speaker literally could not see a problem with accepting that humans are developing a shorter attention span: “we have to adapt to the needs of our students”. I highly disagree, because most jobs require you to be able to focus for more than fifteen minutes, and if the educational system does not see that, then we have a huge problem.

Technology is wonderful and powerful. But we need to address how we use it, how dependent we are on it, and how we can reduce its usage in some parts of our lives. And mind you, this comes from a Software Engineer.

./Lars

Writing my notes by hand

01 May 2024

In Why I use notebooks I discussed why I prefer physical notebooks over digital ones. However, I feel it is time to revisit this, as I have made some findings that surprised me.

First off, let us talk about how I organise my notes. I have two types of notes: temporal and persistent. This classification is necessary, as it affects where I write my notes.

Temporal notes are notes I will only need until a task is done. This could be how a certain data protocol wants data organised in packet types A, B, and C. When it is related to work, I am unlikely to need that for long, as I can always read the code I wrote later. These notes I most often write on loose-leaf paper or on a notepad. If I believe a temporal note is worthy of being persistent, I transfer it to my persistent note storage, which is a notebook.

Persistent notes I keep in one or more notebooks; I am addicted to notebooks. Persistent notes are notes where I feel the information could be useful further down the line. For work it could be a specific software library and what it does, or a software architecture and how it is defined. Privately, it could be a “journal” of a fermentation process and what I messed up, in case I did.

This split between temporal and persistent notes lets my mind run garbage collection and throw out notes I no longer need from my valuable head space. Or at least it keeps them from taking up prime seats. Additionally, by having my notes in notebooks, often sorted by topic, I have the option to quickly find my notes and go through them. You could do the same on a computer of course, but there is something about the tactile feeling that makes it more fitting for me.

I even bring a notebook to meetings and take notes as detailed as possible, and I have one just for meetings. This does two things. First off, it shows the other participants in the meeting that I take it seriously and want to grasp what the meeting is about; I have gotten compliments on this multiple times. Secondly, it enables me to write very concise summaries of meetings, which I can then send by email to all participants so we have a record of what was agreed. We can even discuss whether my notes are correct. Sometimes my meeting summaries have led to a discussion about something that was misunderstood, and once (just once) it has saved my behind, because half the team had completely misunderstood a task.

Now we move on to what is actually the most important part of this for me. Recollection! I find that weeks after I have taken a note by hand and placed it in my persistent note storage, I can actually recall it. This means that I do not need to look things up again, which in turn means I am saving time. And time is simply a resource you can never get back.

I am unsure if it is the tactile feeling of pen on paper that makes me remember, but there is something about it that is just different. I think what has changed the most is how I separate my note taking between temporal and persistent. I think it makes my brain more susceptible to remembering, but I am not sure.

I will keep updating you on how this continues for me.

./Lars

Migrating QueryC++ from GitLab to Codeberg and transferring ownership to Obsidian Wolf Labs

22 April 2024

This post covers two things: the migration of QueryC++ from GitLab to codeberg.org, and the transfer of code ownership to Obsidian Wolf Labs. Below I will explain why I have made these choices, starting with the migration.

Originally, QueryC++ was hosted on GitHub, and the code from back then is still available there; if you are interested, here is the link. I moved to GitLab because I at the time had (and still have) some problems with how Microsoft was changing GitHub; I disagreed with the changes on a philosophical level. This has not really changed, and I am not too happy about Microsoft Copilot either (a thing I will not link). So I moved to GitLab, I have been really happy with it, and I have no plan to fully abandon the platform; I have not fully abandoned GitHub either. So why am I moving QueryC++? Well, quite frankly, I like the philosophy of Codeberg and I think it is the right home for an open source library like QueryC++ to live and thrive. I like the ideas behind Codeberg and that it is hosted in Europe. The code is now available here: codeberg/queryc++.

However! For QueryC++ I will try something new. I will keep the GitLab remote active (if I can figure out how) and push the code to both repositories as a backup. But all issues and such will be moved to Codeberg.
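
For what it is worth, here is a sketch of how such a mirror could be configured with plain git; the URLs are placeholders, not the actual repository addresses:

# Add both forges as push URLs on the same remote, so a single "git push" updates both.
git remote set-url --add --push origin <codeberg-url>
git remote set-url --add --push origin <gitlab-url>

# Verify: one fetch URL, two push URLs.
git remote -v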

Next, transferring ownership to Obsidian Wolf Labs. Who are they? Well, they are… me! I am the owner and sole engineer at Obsidian Wolf Labs, so why the transfer? Well, it feels right to do this, and that is about it. The code will remain under the BSD license and that is unlikely to ever change. I just felt like it was time to take this particular plunge.

So that is it. I hope you will keep following the development of QueryC++.

./Lars

Why it is okay that Gnome is against theming

31 March 2024

Over the last couple of years there has been some discussion in the Gnome community, in particular on r/gnome, about why Gnome is against theming. This article from OSnews, GNOME to prevent theming, wider community not happy, explains the issue fairly well. Then there are some blog posts discussing this in more detail: GNOME developers against themes? and Why and how libadwaita prevents theming?. The OSnews article claims that the community in general is frustrated, which I sort of agree with based on what I have read on the subject over the past five years or so. Below is a comment taken from the Why and how libadwaita prevents theming? discussion on Gnome's Discourse (link to the actual comment):

I don’t dispute the problems with replacing stylesheets and I think distros should stop it, but the artificial restrictions are stupid. If I want to break my app, let me please…

In contrast, Gnome has a post about Restyling apps at scale and what some of the issues are, and a lot of the discussion was in particular sparked by the post Please don't theme our apps. To outline the points abstractly, it boils down to:

  • It is difficult/impossible to ensure 100% compatibility between an app and all themes
  • Developers are being blamed when an app breaks
  • The UI and experience break
  • The consistency across the system UI may break (side note: this is one of the reasons libadwaita was created)

And that is okay. Why is it okay? Should Linux not be a system where you can play and do whatever the hell you want? Well, it still is; we can just use another desktop environment (DE) or window manager. We can still play around with what we want. We just should not expect the developers of our apps or DEs to adapt to us accordingly, because if they had to do that for all users, the development process would be hell. But this is actually not why I agree with Gnome's perspective. So let me go into a little more detail on why I agree with it.

If we look at macOS, there is one design, one style to follow, if you build apps with SwiftUI or earlier versions of Apple's development frameworks. This gives all apps a somewhat familiar look, and menus are often similar, or at least close enough that you can guess where you are going. This, combined with macOS's overall strides towards a streamlined UI, gives a complete system User Interface (UI) and User Experience (UX). That, combined with their ecosystem, gives Apple a very good edge in retaining users, as they get comfortable. So why is this relevant? Well, for a long time on Linux there was no single design language, no unified UI, and therefore no unified UX. Although this is both good and bad, it can scare some people away from Linux, something I (at least) am not interested in.

However, back when Gnome went from version 2 to version 3, we saw the dawn of a system that could provide this. Side note… if you want to read about how angry the Gnome community can actually get, go back and read about the switch from Gnome 2 to 3: flame wars everywhere! Gnome got a modern look, it felt more like macOS, a thing many hated, but it was pretty and shiny. Since then Gnome has morphed, not so much in design but rather in philosophy. It has become a more unified attempt at providing a good UX for Linux right out of the box. Just look at how good Fedora with vanilla (unmodified) Gnome is out of the box! As this philosophy has evolved, big ideas have emerged, and one of these is libadwaita. libadwaita gives you great building blocks to create GTK apps for Gnome that look like the Gnome Core Apps (some are still being updated). This is good: it provides a UX similar to what Apple provides, and as both a macOS and Fedora/Gnome user this is very much appreciated, as it gives some coherence. I do not mean because the two systems look similar (although they do), but rather that it almost feels like we have a healthy competition. That I cannot break it with theming is nice, that I do not have to is nice, that everything looks the same is nice, and that is something I really appreciate. In particular after stints of using XFCE (although it is an awesome DE) or some window manager like Fluxbox over whatever I fancied on a given day.

Another thing I would like to mention is that KDE with Plasma, and in particular KDE Neon, seems to be going in a similar direction. I am guessing this is in particular down to how Qt is handling things at the moment, or maybe I am just projecting. I do not see it as a bad thing if mainstream distros like Fedora, Ubuntu, and Mint seem less daunting to newcomers; that is something I think we need. Finally, I hope COSMIC by System76 will provide something similar in terms of a coherent UX, as I expect that they, with Pop!_OS, will soon have a mainstream distro.

./Lars