rolisz's blog
https://rolisz.ro/

Growing roots
https://rolisz.ro/2024/07/19/growing-roots/
Fri, 19 Jul 2024 12:29:56 GMT

Last week I was on vacation in Spain. I was there for only a week, but that was enough for the barista at the closest coffee shop to remember how I like my coffee: "Oat milk, right? And sorry, we don't have sugar"[1]. I had been there maybe only 3 times before, but she made the effort to remember that. This made me feel much cozier, and it made me love Alicante even more.

And this got me thinking about how belonging to a community feels good, even if it usually means growing deep roots in a place. I used to travel a lot. In 2016 I spent 3 months on the road. In 2017 I took 43 flights.

But I don't miss that lifestyle anymore. I'm glad I found my place here in Oradea and I'm glad my place is here in Oradea[2]. After 6 years of living here, I have so many memories, most of them about being part of different communities.

There's the time when I got stuck in mud with the car in front of the house. My neighbor came out dying of laughter, but he helped me get the car moving. Today he's the person I drink coffee with most often.

There's the first presentation I gave about machine learning at the local tech hub. Now I have a lot of friends there. Together we've done many events. We hang out to play board games together. We've gone on hikes and we've even started side projects together.

Then there was the time when somebody from our church was quite sick and needed blood donations, so many of us went to the local donation center for him (and half of us were turned away for various reasons, but that's another story).

Most of the time engaging with people doesn't come easily to me, but I believe it's crucial and well worth the effort. I believe these ties that I build with different communities are what makes life worth living, and some of them are even the point of life.

And I think it's almost impossible to build those ties when you don't have stability in one place. It takes a long time and much purposeful effort to be able to nourish these relationships. Sometimes I realize I've now lived here in Oradea for twice as long as in Zurich and I'm surprised. But then I think I'm not even halfway to how close I can become with the people here.

[Photo: Enjoying the coffee and a really good book in Alicante]

  1. I get that it's a specialty coffee shop and you're supposed to enjoy the pure taste of coffee, but still, no sugar at all? ↩︎

  2. It's a really nice city! ↩︎

Keeping track of podcasts
https://rolisz.ro/2024/06/19/keeping-track-of-podcasts/
Wed, 19 Jun 2024 16:23:59 GMT

One of the things that has sparked a lot of joy in my life in the last two years was building an app to keep track of "podcasts" I want to listen to. I'm using podcast in a very loose sense here: it's more like any content where I need to pay attention with my ears, from actual podcast episodes, to YouTube lectures from Stanford, to courses on various platforms, to sermons in mp3 format, to audio/video on Substack, etc.

Previously I would always lose track of things, forget about them, and never actually get around to listening to them. And when I did have time to listen to something, I wouldn't know what to pick.

So I decided to make an app for this. Existing apps didn't really work because I was listening to content across a very wide variety of platforms.

Having heard the startup adage to start scrappy, I made the first iteration a Google Sheet.


Initially it just had a couple of columns, but their number kept growing. At some point I started color coding them (for example by length, so it's easier to find something based on how much time I have). Then came tags, starring favorite episodes, and notes.

Then I started adding statistics. I learned that you can do a sort of SQL query in Sheets.


I love statistics. I love tracking things.

I even learned how to do some basic automation in Sheets and how to add "buttons".

But eventually I outgrew the spreadsheet. After almost two years, I decided it was time to graduate to something more serious, so I built an app in Django. The only design award it will win is for worst-looking app, but it works very well for me. And it only has to work for me!


Now I can filter things even better: I can drill down quickly by speaker, series, or tags.

Using this app has helped me listen to an average of 9 episodes per week. Previously, listening to 1 episode per week was good. So I'm glad I'm more organized and able to learn more.

But there are some other important lessons here, about home-cooked software. This is an app I built for myself. It will have one user for the rest of my life (maaaaybe someone else from my family will use it, but it's unlikely). That means I can get away with things like using SQLite3 for the database. I can have one 700-line file for the views. I can mix JavaScript libraries like there is no tomorrow. It's tremendously fun to develop, and I think more programmers should build this sort of personal-use software. It's small, it's niche, but over time it can get to do some pretty impressive stuff.

And you know what? It's super fast. The network round trip from my NAS, which sits on the same LAN as me, is milliseconds. The app is super snappy, much faster than almost any app made by big tech or an American startup. With a minimum of effort put into avoiding cascading SQL queries, it's possible to get all page loads under 300 ms.
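
To make "avoiding cascading SQL queries" a bit more concrete, here is a minimal Django sketch of the kind of query tuning involved; the Episode, Series and Tag models are hypothetical stand-ins, not the actual models from my app:

# models.py of a hypothetical Django app
from django.db import models


class Series(models.Model):
    name = models.CharField(max_length=200)


class Tag(models.Model):
    name = models.CharField(max_length=50)


class Episode(models.Model):
    title = models.CharField(max_length=300)
    series = models.ForeignKey(Series, on_delete=models.CASCADE)
    tags = models.ManyToManyField(Tag)


# Without the two hints below, rendering a list of episodes fires one extra
# query per episode for the series and the tags (the classic N+1 problem).
def episodes_for_listing():
    return (
        Episode.objects
        .select_related("series")      # joins the series in the same query
        .prefetch_related("tags")      # fetches all tags in one extra query
        .order_by("-id")
    )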

I remember reading about Linus, who invented his own programming language, and wrote his own editor and a ton of other apps for personal usage. At the time I was like why would you waste your time on that? Now I'm starting to understand him and I'm starting to expand this app to slowly do more things.

But there's an interesting twist: I tried the same thing for tracking articles to read. It doesn't work at all. I add things and I never read them. Why? Because new articles to read are always easy to come by: Twitter, Reddit and Hacker News are always full of the latest shiny stuff, so there's no natural context for me to go and find saved articles to read. So I'll have to do some more thinking about how to solve that problem.

Blogging in 2024
https://rolisz.ro/2024/06/08/blogging-in-2024/
Sat, 08 Jun 2024 05:35:50 GMT

I've been blogging for 14 years now. I started off writing in Romanian, while I was in high school. I credit writing all those posts with helping me get a really good grade on my Romanian exam at the end of high school.

Then in university, I wrote a lot, mostly about school stuff: my blog was really popular with my classmates, and even the professors started mentioning it now and then.

After I moved to Switzerland, I used the blog as a way to let my friends know what I was up to. I blogged a lot about where I went and what I did.

After I moved back to Romania I still kept posting various things, peaking during the pandemic when I wrote a record of 62 posts in 2020.

But then I started posting less and less, writing 6-7 posts per year. And it's not just the frequency that has dropped, but also the content: in the last 2 years, I barely wrote about personal stuff.

In a weird way, I find myself reticent to expose a lot of information online. I've been posting mostly about programming and very little about me, even on Facebook or Twitter. I recently listened to Mary Harrington talking about Digital Modesty and it resonated a lot with me. I don't really want to share online what's going on in my personal life, neither here nor on social media.

So what's next for my blog? Well, I still plan to write here. I actually have several ideas for posts, but I think they will be a bit different from before. The programming ones will definitely continue, but there will also be other things that I've learned in various aspects of life, or just random philosophical and Christian meditations. Also, the short TIL posts will continue, because they are very useful for me. But expect far fewer things about rolisz. So, my dear 55 email subscribers and 60-ish RSS subscribers, if that doesn't sound interesting, no hard feelings if you unsubscribe; thanks for following me so far!

FastAPI server stuck on Windows
https://rolisz.ro/2024/02/27/fastapi-server-stuck-on-windows/
Tue, 27 Feb 2024 14:19:05 GMT

When developing FastAPI based web services, I sometimes run into a weird issue on Windows: I kill the uvicorn process, but the port it used remains in use and I get responses in the browser, even though the server appears to not be running.

Today I learned what the problem is: uvicorn spawns some child processes, and if they don't get cleaned up properly, Windows won't clean up after the parent process either (which is the one holding the port open).

First, we have to find the PID of the process that is listening on the socket, and then we have to look for any processes that are its children:

> (Get-NetTCPConnection -LocalPort 8000).OwningProcess
38876
> wmic process where ("ParentProcessId=38876") get Caption,ProcessId
Caption     ProcessId
python.exe  1824

You can then use your favorite process-killing tool: Task Manager, Resource Monitor, or taskkill /F /PID 1824.
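
If this comes up often, the same lookup and cleanup can be scripted. Here is a rough sketch using the psutil package (it needs a pip install psutil first; the port number is just an example, and listing other users' connections may require an elevated shell):

# Kill whatever is still holding port 8000: children first, then the parent.
import psutil

PORT = 8000  # example: the port uvicorn was bound to

for conn in psutil.net_connections(kind="tcp"):
    if conn.laddr and conn.laddr.port == PORT and conn.pid:
        parent = psutil.Process(conn.pid)
        for child in parent.children(recursive=True):
            child.terminate()  # the leftover uvicorn worker processes
        parent.terminate()     # the process that owns the listening socket
        break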

Pushing Git repos between random machines
https://rolisz.ro/2024/02/15/pushing-git-repos-between-random-machines/
Thu, 15 Feb 2024 10:35:08 GMT

Until today, I always used Github as a server, pushing my code changes there, then pulling from there to the deployment server (in a more or less automated fashion).

But today I ran into a locked-down VM that blocked Github, so I had to find alternative ways to get my code there. The farmer method is to zip the code and scp it over, but I learned that you can just push and pull Git repos between random machines, with no special software needed.

The process is as follows (the temporary branch on the remote is needed because Git refuses to push to the branch that is currently checked out):

On the remote VM:

> mkdir path/to/repo
> cd path/to/repo
> git init
> git checkout -b tmp

On the local machine:

> git remote add remote_name username@address:/path/to/repo
> git push remote_name master

On the remote VM:

> git checkout master

It's best if you have ssh authentication using public keys and don't forget to set up the SSH config correctly for Git.

It's nice when something in the software world just works!

Using custom SSH keys with Git
https://rolisz.ro/2023/09/25/using-custom-ssh-keys-with-git/
Mon, 25 Sep 2023 14:59:02 GMT

As a freelancer, I work for many clients, who have their source code in many different places, often self-hosted. I generally create new SSH keys for each separate platform. Unfortunately, Git doesn't provide a direct option for which keys to use; you have to configure this in the ~/.ssh/config file:

Host private.git.server.com
    PreferredAuthentications publickey
    IdentityFile ~/.ssh/key_file_for_this
    
Per folder custom titles for Jupyter notebooks
https://rolisz.ro/2023/07/05/custom-titles-for-jupyter/
Wed, 05 Jul 2023 14:39:12 GMT

I work on many different projects and I use ActivityWatch for automatic time tracking. AW has a feature where you can use regexes to match window titles to assign them automatically to different projects. But, unfortunately, Jupyter Notebooks have a "{filename} - Jupyter Notebook" title in the browser, so it's hard to match on them. I'd rather not name all the notebooks with a prefix of the project, so I went looking for a different solution.

I quickly found a way to customize the title shown in the browser, but not in a dynamic way (i.e., dependent on the folder I'm in).

To get dynamic custom titles, I first had to set the following two values in the config file used by Jupyter (~/.jupyter/jupyter_notebook_config.py):

c.NotebookApp.extra_template_paths = ["~/.jupyter/templates/"]

import os
from pathlib import Path

cur_dir = os.getcwd()

projects = {
        'folder_name1': 'Project 1',
        'folder_name2': "Project 2",
}

cur_env = Path(cur_dir).name
for p in projects:
    if p in cur_dir:
        cur_env = projects[p]
    
c.NotebookApp.jinja_template_vars = {"env_name": cur_env}

The first line adds a new path where Jupyter looks for HTML templates. Then I get the name of the current working directory. I look for some folder names, and if one matches, I use its project name; otherwise I use the name of the current working directory as the project name. On the last line I inject the env_name variable into every Jinja2 template used by Jupyter.

Then I copied the template file from the notebook project into ~/.jupyter/templates/page.html and, at the end of the last script block, added the following:

{% if env_name %}
window.title += " - {{ env_name }}";
document.__defineSetter__('title', function(val) {
    document.querySelector('title').childNodes[0].nodeValue = val + " - {{ env_name }}";
});
{% endif %}

First I check if env_name is set. If it is, I add JavaScript code which appends its value to the window title and also updates the title whenever it changes (such as when you rename the file).

This is a bit hackish, and when the notebook templates update, I should update my own copy as well. Luckily, that doesn't happen too often; there have been only 5 commits since 2020.

Setting up systemd services
https://rolisz.ro/2023/06/13/setting-up-systemd/
Tue, 13 Jun 2023 14:21:16 GMT

I often need to set up a VM with a server to run some ML models. While I could package things in Docker and deploy them like that, whenever possible, I like to do things more directly, using systemd services.

To create a systemd service, create a file called /etc/systemd/system/my_service.service. This will create a service that will be started by the root user, but it can be run under another user.

There are three necessary sections in a systemd service file: [Unit], [Service], and [Install].

[Unit]
Description=Service to run my fancy API
After=network.target
StartLimitIntervalSec=100
StartLimitBurst=5

[Service]

Restart=always
RestartSec=10
User=ubuntu
Group=www-data
WorkingDirectory=/home/ubuntu/app
Environment="PATH=/home/ubuntu/.venv/bin"
Environment="API_KEY=123456"
ExecStart=/home/ubuntu/app/.venv/bin/uvicorn main:app --my_params
StandardOutput=append:/var/log/app/stdout.log
StandardError=append:/var/log/app/std_error.log

[Install]
WantedBy=multi-user.target

First, we have to describe the service. Then we tell it when to start: after the network is up and running. We tell it to stop restarting if it crashes 5 times in 100 seconds.

Then we describe the actual service. We tell it to always restart (keeping the above-mentioned limits), waiting 10 seconds between restarts. We can specify the user, group, working directory and environment variables under which to run the program.

With ExecStart we can specify the program to run. If it's a simple FastAPI server, I usually write out the full path here. But, if it's something more complicated, where I might have to play around with the command more often, I create a bash file and start the service using that: ExecStart=bash /home/ubuntu/start_service.sh.

Then we redirect the standard output and the error stream to files. The last bit I have no idea what it is, but it's needed.

To start the service you have to first run sudo systemctl daemon-reload. You will have to run this every time you change the service definition file. Then you can start the service with sudo systemctl start my_service (the name of the file, without .service).

To enable starting it automatically on startup, you can run sudo systemctl enable my_service.

To see the status of the service: sudo systemctl status my_service. If it's crashing, you can look at journalctl -xe or you can check the log files you configured.

I recommend always writing out full paths, especially for the Python interpreter, to avoid any ambiguities as to what will run your program or what exactly will be loaded.

And now you know how to create a simple systemd service!

Telegram notifications from Jupyter Notebooks
https://rolisz.ro/2023/05/23/telegram-notifications-from-jupyter-notebooks/
Tue, 23 May 2023 08:41:51 GMT

When running long-running code in Jupyter, I want to get notified when it finishes so that I can get back to it. There is an extension that does this with browser notifications, but there are times when I leave the computer while waiting for an ML training run to finish.

For long-running CLI commands there is ntfy, a command-line tool that allows you to send notifications through a lot of channels.

So I hacked the two together to get some code that automatically messages me on Telegram when a cell finishes more than 60 seconds after it started. This extension is registered automatically on startup of any IPython or Jupyter notebook session (even if they are installed in random virtual environments). Why Telegram? Because I already have it installed and it seemed like the easiest integration to set up.

The code has to be placed in ~\.ipython\profile_default\startup\01notification.py. You can place multiple files in this folder and they are loaded in lexicographic order, so you should prepend a number if you care about order. First, a couple of magic imports:

import time
import subprocess

from IPython.core.getipython import get_ipython
from IPython.core.magic_arguments import argument, magic_arguments, parse_argstring
from IPython.core.magic import (register_line_magic)

To send the notification using ntfy, I'm simply calling the program using subprocess. The path resolution used by subprocess is not clear to me, so I had to use the full path to the executable.

def display_notification(message):
    subprocess.run([r"C:\Users\Roland\AppData\Local\Programs\Python\Python310\Scripts\ntfy.exe", "-b", "telegram", "send", message]) 

Then we define some variables, one for the threshold for notification and the other one for remembering the start of the execution:

autonotify_after = 60
run_start_time = None

And now we define the magic function (that's what these things are called in IPython). It has two arguments, one to override the duration of the threshold for notifications and the other for the default message. I copy pasted the decorators as you see them. After parsing the arguments (which come as a string), we register two event handlers: one to run before a cell is executed and one after a cell is executed.

@magic_arguments()
@argument(
    "-a", "--after", default=None,
    help="Send notification if cell execution is longer than x seconds"
)
@argument(
    "-m",
    "--message",
    default="Cell Execution Has Finished!",
    help="Custom notification message"
)
@register_line_magic
def autonotify(line):
    # Record options
    args = parse_argstring(autonotify, line)
    message = args.message.lstrip("\'\"").rstrip("\'\"")
    if args.after:
        global autonotify_after
        autonotify_after = args.after
    ### Register events
    ip = get_ipython()

    # Register new events
    ip.events.register('pre_run_cell', pre_run_cell)
    ip.events.register('post_run_cell', lambda: post_run_cell(message))

The handler to run before a cell is simple: we just record the start time of the run.

def pre_run_cell():
    global run_start_time
    run_start_time = time.time()

The second handler is slightly more complex. We look at the output of the last cell and append it to the message if it's not an "empty" value. We check how much time has elapsed to know whether to show a notification or not:

def post_run_cell(message):
    # Set last output as notification message
    last_output = get_ipython().user_global_ns['_']
    # Don't use output if it's None or empty (but still allow False, 0, etc.)
    try:
        if last_output is not None and len(str(last_output)):
            message = message + "\n" + str(last_output)
    except ValueError:
        pass # can't convert to string. Use default message
    
    # Check autonotify options and perform checks
    if not check_after(): 
        return
    display_notification(message)


def check_after():
    # Check if the time elapsed is over the specified time.
    now, start = time.time(), run_start_time
    return autonotify_after >= 0 and start and (now - start) >= autonotify_after

The last piece of magic is to run this function. The other blog post I was inspired by said you should delete the function for things to work properly, so I did that as well:

ipython = get_ipython()
ipython.magic('autonotify')
del autonotify

And voila, now you will get Telegram notifications automatically when your model finishes training! Setting up a Telegram bot is left as an exercise for the reader.

TIL: Recreating tmux socket
https://rolisz.ro/2023/04/28/tmux-socket/
Fri, 28 Apr 2023 08:38:00 GMT

I use tmux as a multiplexer and for running some long-running commands on servers. Today I encountered a weird issue: when trying to attach to an existing tmux session (or when trying to do anything with tmux), I would get the following error:

can't create socket: Permission denied

A quick search on Kagi revealed that it might be a permissions issue. I tried resetting the permissions of the /tmp/tmux-* folders, but it didn't help.

What did work, however, was recreating the socket that the tmux server uses to communicate with the tmux client. To do that, you have to run the following command:

killall -s SIGUSR1 tmux

And then the tmux attach worked perfectly, showing me all the windows from my old session.

I have no idea why this happened though.

Interview about deep fakes
https://rolisz.ro/2023/04/08/interview-about-deep-fakes/
Sat, 08 Apr 2023 15:37:27 GMT

Yesterday my friend Marius Corici called me and asked if I wanted to be interviewed (again) by ProTV, this time about deep fakes. It was a Zoom interview this time, and they asked me not to blur my background (ugh). And I had my second 15 seconds of fame last night (the actual interview was 5 minutes long, but they cut out most of it).

[Video: the interview clip]

tl;dr: Deep fakes are already here, they are getting easier and easier to make and it's a cat and mouse game to detect them.

Source: ProTV

Making a loudness monitor for online meetings
https://rolisz.ro/2023/02/02/making-a-loudness-monitor/
Thu, 02 Feb 2023 15:43:56 GMT

As I work from home 90% of the time, I run into a small issue during meetings: I sometimes speak too loudly. Before my daughter Gloria arrived, this was something that annoyed my wife and others in the house, but now, when Gloria is sleeping, this is not just an annoyance, it's a BIG problem, because nobody wants to wake up a toddler.

While I do have monitoring that alerts me via Signal that I'm speaking too loud (my wife), I wanted to write a program to do that all the time, in the spirit of using programming to make my life nicer.

So I started to look for a Python library that can give me information about the sound level from my microphone. A quick Kagi search revealed several options, but sounddevice seemed like the best one.

The first step is to identify the microphone. For that I need its name as it's known to the operating system. I can get that by running the following code in a Python console:

> import sounddevice as sd
> sd.query_devices()
0 Microsoft Sound Mapper - Input, MME (2 in, 0 out)
1 Microphone (Yeti Stereo Microph, MME (2 in, 0 out)
2 Microphone (WO Mic Device), MME (2 in, 0 out)
....
> sd.query_devices()[1]['name']
'Microphone (Yeti Stereo Microph'

I get a long list of stuff, but I see something with Yeti in the name so I grab that one.

Now let's start listening to the microphone. Sounddevice offers a callback-based API, where it passes along the raw audio data received from the microphone. From that, I estimate the loudness by calculating the norm of the sound:

import numpy as np
import sounddevice as sd


def print_sound(indata, frames, t, status):
    volume = np.linalg.norm(indata) * 10
    print(volume)


name = 'Microphone (Yeti Stereo Microph'
with sd.InputStream(device=name,callback=print_sound):
    for i in range(5):
        sd.sleep(1000)

Running this gives something as follows. Can you guess where I snapped my fingers?

0.3724626451730728
0.6015866994857788
0.9348087012767792
0.7427176833152771
0.8615989238023758
0.7162655889987946
0.5638395622372627
0.7117109000682831
59.17434215545654
50.70761203765869
20.951063632965088
14.069621562957764
9.29598331451416
5.908793210983276
3.782018721103668
2.402055263519287
1.7902085185050964
1.1522774398326874
0.793280228972435

The next step is to make it warn me when I speak too loudly. For this I keep a buffer of the latest sound intensities, so that I can detect either when something loud has been happening for a while or when a really loud noise happened in the last few frames:

import time
from collections import deque

import numpy as np
import sounddevice as sd


last_alert = time.time() - 10
q = deque(maxlen=200)


def print_sound(indata, frames, t, status):
    global last_alert
    volume_norm = np.linalg.norm(indata) * 10
    q.append(volume_norm)
    last_elements = [q[i] for i in range(-min(50, len(q)), 0)]
    recent_avg_sound = sum(last_elements) / len(last_elements)
    num_high_count = len([x for x in q if x > 20])
    if num_high_count > 30 or recent_avg_sound > 50:
        if time.time() - last_alert > 10:
            print(f"You are speaking at {volume_norm:.2f}. Think of Gloria!\a")
            last_alert = time.time()


name = 'Microphone (Yeti Stereo Microph'
with sd.InputStream(device=name,callback=print_sound):
    while True:
        sd.sleep(1000)

Now, when running from a Terminal (either on Windows or Linux), this will make a bell sound if in the last 5 seconds (that's about 200 frames) there have been more than 30 frames with loudness over 20 or if in the last second the average was over 50 (this would mean a really loud sound).

If you want to run this outside of a terminal, you can use beepy, for example, to make sounds, and replace the print statement with this:

from beepy import beep

beep(sound="error")

To run this on startup on Windows, I created the following ps1 script in the startup folder:

C:\Users\Roland\Programming\mic_check\.venv\Scripts\pythonw.exe C:\Users\Roland\Programming\mic_check\mic_check.py

Another improvement would be to make it easier to see the current loudness and to be able to quit easily (because the ps1 script runs in the background). For this I used the infi.systray library on Windows:

from infi.systray import SysTrayIcon

def quit(systray):
    global still_on
    still_on = False


menu_options = ()
systray = SysTrayIcon("icon.ico", "Mic check tray icon", menu_options, on_quit=quit)
systray.start()
still_on = True
name = 'Microphone (Yeti Stereo Microph'
with sd.InputStream(device=name,callback=print_sound):
    while still_on:
        sd.sleep(1000)

And now, hopefully I'll learn to control my loudness better!

You can find the full code here.

Learning in public: Transcribing podcasts with Whisper
https://rolisz.ro/2022/11/23/transcribing-podcasts-with-whisper/
Wed, 23 Nov 2022 15:32:34 GMT

My next project where I learn in public is about doing transcripts with Whisper.

Whisper is a quite good automatic speech recognition model that is open source and can run on your own computers, provided you have a GPU.

I prefer reading to listening, so I wanted to transcribe my long list of things to listen to, ideally in an automatic way.

The first step was to use Whisper to transcribe anything dropped into a folder. I recorded myself while doing this and I plan to do at least 2 more sessions of this.
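
A first version of that step can be surprisingly short. This is only a sketch, with an assumed inbox folder and the open-source whisper package, not the exact script from the recording:

# Transcribe every mp3 dropped into a folder, writing a .txt next to it.
# Assumes `pip install openai-whisper` and ffmpeg available on the PATH.
from pathlib import Path

import whisper

inbox = Path("~/podcasts/inbox").expanduser()   # hypothetical watch folder
model = whisper.load_model("medium")            # pick a size that fits your GPU

for audio in sorted(inbox.glob("*.mp3")):
    transcript = audio.with_suffix(".txt")
    if transcript.exists():
        continue  # already transcribed on a previous run
    result = model.transcribe(str(audio))
    transcript.write_text(result["text"], encoding="utf-8")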

Lesson learned: OBS Studio uses a lot of resources, and sometimes the recording has issues because Whisper is also hogging resources.

Where's rolisz?
https://rolisz.ro/2022/07/15/traveling-again/
Fri, 15 Jul 2022 10:50:38 GMT

The last couple of months have been quite busy for me. Consultant life is starting to pick up the pace (at least some aspects of it), especially the traveling part.


It started in March: Make IT in Oradea, a startup incubator where I'm involved as a mentor, had its first in-person pitching event in Timisoara. I was invited to be part of the jury evaluating the submissions. I had never been to Timisoara before, so I managed to scratch that itch of traveling to a new city. Unfortunately, I was not very impressed by the city, but at least I met some interesting people.


In April, Today Software Magazine had its first in-person event in two years, and I gave a short presentation about modern NLP methods. The presentation was an exercise in improvisation, because the platform used to show the slides had technical difficulties while I was speaking, so I had to wing it.

[Photo: All the speakers from The Developers conference]

And then June was busy. I gave a talk at The Developers, the first in-person technical conference in Cluj. I talked about Large Language Models. I listened to some great talks, and I talked to some people who are also in the consulting business and got some great advice (and some not-so-encouraging things, such as the guy who signed a contract with a client 7 years after the client heard him speak at a conference).

And then the fun started: flying abroad again. It had been almost two years since I'd flown, and even that was a short 50-minute flight. And it had been 3 years since I'd been abroad (Debrecen doesn't really count).

[Photo: The OTH team at CodeX]

Through Oradea Tech Hub I was invited to mentor at the CodeX hackathon in Riga. It was a really well organized hackathon. The venue was great and it was great to feel the energy from the students (mostly). I definitely wouldn't pull an all-nighter for a hackathon anymore :))

I have to admit I didn't know much about Riga before and I was very pleasantly surprised. It's not a very big city, but it has a cool vibe and it's very clean. 10/10 would recommend, I want to go there again with Roda and Gloria.

It has many buildings in the style of Art Nouveau, like Oradea (where I live), so I finally found out what Art Nouveau is! It also has a statue of St. Roland, so that might have contributed to me liking the city so much.


The last trip in June was to Berlin, where I attended the datalift summit and I was a session chair. That was a really good conference! I met so many great people, all doing machine learning. I hope I can even collaborate with some of them sometime soon.

But the city itself... didn't impress me. I won't go again to Berlin anytime soon. The food was really good, with lots of very diverse options available, but the city itself... meh, too dirty. Sure, the city center is nice, but once you've seen one European capital, you've seen them all.


Now it's time for a bit of relaxing, so I'm in Sibiu (where I have grandparents). It's Gloria's first longer trip away from home, so we're curious how she'll handle it! So far so good, so we're hoping for at least two more trips in the coming months!

Roland tries new things: Horse riding
https://rolisz.ro/2022/06/09/horse-riding/
Thu, 09 Jun 2022 14:48:55 GMT

My two brothers-in-law are fans of horse riding, and last week they invited the rest of the family to join them for a day of horse riding. They encouraged us with fun tales of how one of them fell off / was thrown off a horse. Having not done anything new in a while, Roda and I agreed to it.

[Photo: The 10 minute horse riding intro]

When we arrived at the Kalota Lovarda, we received a brief intro on how to ride a horse while it walks and an even shorter intro on how to ride it while it trots. Turns out riding a horse is not a passive activity: you have to move your hips to nudge the horse to keep moving, while also lightly pulling on the reins. If you let the reins go, the horse will most likely take a break to eat some grass. Horses really love eating grass and will do that whenever they have the chance.

[Photo: My horse, Titi]

The first five minutes of the intro had me regretting that I had signed up for this activity. Do this for the rest of the day? Oh boy, oh boy. Please don't fall off the horse! Please don't throw me off, horsey!

[Photo: "Excited" to start our trip]

It was an interesting experience where I learned a lot, such as the fact that horses have thick skin, so they don't care about going through bushes and thorns - but you do. Also, when going through a thicker forest, horses only make sure they don't hit trees - your knees are none of their concern, or worse, not even your head when going below some low branches.


The views were beautiful, as always, but honestly, I'm not sure if I'd rather do these trips on horseback or on foot. There is a certain serenity that comes with the rhythm of a horse's step, which makes the scenery more enjoyable in a way. But it's so exhausting after 8 hours...

Also, horses have very different personalities and behaviors. Some horses have no problem pooping while walking; others, like my Titi, would always stop whenever they felt the need to relieve themselves. Sometimes Titi would even go off the path, as a courtesy to the others. Some horses would avoid stepping into the mess others left; other horses had no problem marching straight through.


One of the funnier things on the journey was when the owner of the horses got off his horse for some reason and then the horse ran away. I think he spent 15 minutes trying to get back on it. It was quite funny, but it also made most of us reconsider getting off our horses.

When we got to the restaurant after 3 hours, everyone was relieved to finally get off the horses and give our bottoms a well-deserved rest. But then it was time to go back, on a slightly different route. This time, even the horses were tired. The older ones would stop periodically for a quick break. But, after 2.5 hours, we did get back to the stables safely.

[Photo: Tired horses, tired riders]

Unfortunately, I never managed to reliably get my horse to trot, and more importantly, I never managed to learn how I should move in the saddle while trotting to avoid making an omelette. Maybe next time? But it won't be anytime soon. I'd rather go to the aquapark.
