Compare commits

...

35 Commits

Author SHA1 Message Date
7516a9dc4a Add AbuseIPDB badge 2026-01-16 23:15:53 +01:00
05b17aec54 Update Tabi 2026-01-16 23:15:47 +01:00
1577063ff2 Add AbuseIPDB verification token 2026-01-16 22:20:19 +01:00
8c60a43ff2 Replace curl healthcheck with wget 2026-01-02 16:38:51 +01:00
7ed98c64c4 Write a retro on 2025 and resolutions for 2026 2026-01-02 16:37:15 +01:00
6f0c1630d6 Remove invalid syntax highlighting language 2026-01-02 16:36:59 +01:00
68d0c45414 Update homepage description 2026-01-02 16:36:46 +01:00
0dec45a603 Update Tabi and Docker deps 2026-01-02 16:36:27 +01:00
901a8d74d6 Remove BlueSky link 2025-11-11 11:50:34 +01:00
ab0d31cdcd Add mcfunction syntax file 2025-11-11 11:50:28 +01:00
c70ce3518b Remove T-rex image 2025-11-11 11:50:12 +01:00
4dc3ef60c0 Update static-web-server 2025-11-04 08:24:01 +01:00
8c0d6e04d9 Update Tabi 2025-11-02 13:18:49 +01:00
0fbf556ef5 Update subnetting post's description 2025-11-02 13:18:27 +01:00
553f7cc027 Release the subnet solver breakdown post's remaster 2025-11-02 13:05:28 +01:00
d493e93b96 Update landing page description 2025-10-25 08:31:00 +02:00
89fe820541 Fix typo 2025-10-25 08:04:57 +02:00
590e75deb7 Update site description 2025-10-22 16:18:53 +02:00
6e874b5960 Tweak the wording of my take on AI 2025-10-22 12:32:24 +02:00
6c68a7fe41 Publish AI usage policy post 2025-10-22 11:07:04 +02:00
5241cc25ba Fix various typos 2025-10-21 19:32:46 +02:00
d188c33a97 Update Tabi 2025-10-21 19:00:03 +02:00
95a70f6fa3 Add NWaCoT dino image 2025-10-08 08:07:20 +02:00
142a91df78 Revert to old banner 2025-10-04 14:06:41 +02:00
5b2903d468 Experiment with making the banner text slightly brighter 2025-10-04 14:00:32 +02:00
acd7768f81 Add health check 2025-10-04 09:23:56 +02:00
72d7452b63 Release Mood Tracker Spreadsheet Breakdown 2025-10-04 09:17:08 +02:00
2855e8d6f2 Use the alpine version of static-web-server image 2025-10-04 09:16:58 +02:00
8ec7752917 Update Tabi 2025-10-04 09:16:36 +02:00
f69da02250 Update intro to reflect the post releasing in late September 2025-09-28 19:29:37 +02:00
11b408a39b Add remote repository config 2025-09-26 15:59:34 +02:00
d8c7591b23 Update base URL 2025-09-26 15:48:42 +02:00
3d46094f87 Change Zola image version 2025-09-26 15:46:17 +02:00
e8116d3e3e Add Dockerfile 2025-09-26 15:27:08 +02:00
75ee0e5c1f Add CC BY-SA 4.0 license text file 2025-09-26 15:04:22 +02:00
14 changed files with 1753 additions and 37 deletions

Dockerfile (new file, +15)

@@ -0,0 +1,15 @@
FROM ghcr.io/getzola/zola:v0.21.0 AS zola
COPY . /project
WORKDIR /project
RUN ["zola", "build"]
FROM ghcr.io/static-web-server/static-web-server:2.40.1-alpine
ENV SERVER_HEALTH=true
WORKDIR /
COPY --from=zola /project/public /public
HEALTHCHECK \
--interval=10s \
--timeout=5s \
--start-period=3s \
--retries=3 \
CMD ["wget", "http://localhost/health", "-O", "/dev/null", "-q"]

LICENSE (new file, +428)

@@ -0,0 +1,428 @@
Attribution-ShareAlike 4.0 International
=======================================================================
Creative Commons Corporation ("Creative Commons") is not a law firm and
does not provide legal services or legal advice. Distribution of
Creative Commons public licenses does not create a lawyer-client or
other relationship. Creative Commons makes its licenses and related
information available on an "as-is" basis. Creative Commons gives no
warranties regarding its licenses, any material licensed under their
terms and conditions, or any related information. Creative Commons
disclaims all liability for damages resulting from their use to the
fullest extent possible.
Using Creative Commons Public Licenses
Creative Commons public licenses provide a standard set of terms and
conditions that creators and other rights holders may use to share
original works of authorship and other material subject to copyright
and certain other rights specified in the public license below. The
following considerations are for informational purposes only, are not
exhaustive, and do not form part of our licenses.
Considerations for licensors: Our public licenses are
intended for use by those authorized to give the public
permission to use material in ways otherwise restricted by
copyright and certain other rights. Our licenses are
irrevocable. Licensors should read and understand the terms
and conditions of the license they choose before applying it.
Licensors should also secure all rights necessary before
applying our licenses so that the public can reuse the
material as expected. Licensors should clearly mark any
material not subject to the license. This includes other CC-
licensed material, or material used under an exception or
limitation to copyright. More considerations for licensors:
wiki.creativecommons.org/Considerations_for_licensors
Considerations for the public: By using one of our public
licenses, a licensor grants the public permission to use the
licensed material under specified terms and conditions. If
the licensor's permission is not necessary for any reason--for
example, because of any applicable exception or limitation to
copyright--then that use is not regulated by the license. Our
licenses grant only permissions under copyright and certain
other rights that a licensor has authority to grant. Use of
the licensed material may still be restricted for other
reasons, including because others have copyright or other
rights in the material. A licensor may make special requests,
such as asking that all changes be marked or described.
Although not required by our licenses, you are encouraged to
respect those requests where reasonable. More considerations
for the public:
wiki.creativecommons.org/Considerations_for_licensees
=======================================================================
Creative Commons Attribution-ShareAlike 4.0 International Public
License
By exercising the Licensed Rights (defined below), You accept and agree
to be bound by the terms and conditions of this Creative Commons
Attribution-ShareAlike 4.0 International Public License ("Public
License"). To the extent this Public License may be interpreted as a
contract, You are granted the Licensed Rights in consideration of Your
acceptance of these terms and conditions, and the Licensor grants You
such rights in consideration of benefits the Licensor receives from
making the Licensed Material available under these terms and
conditions.
Section 1 -- Definitions.
a. Adapted Material means material subject to Copyright and Similar
Rights that is derived from or based upon the Licensed Material
and in which the Licensed Material is translated, altered,
arranged, transformed, or otherwise modified in a manner requiring
permission under the Copyright and Similar Rights held by the
Licensor. For purposes of this Public License, where the Licensed
Material is a musical work, performance, or sound recording,
Adapted Material is always produced where the Licensed Material is
synched in timed relation with a moving image.
b. Adapter's License means the license You apply to Your Copyright
and Similar Rights in Your contributions to Adapted Material in
accordance with the terms and conditions of this Public License.
c. BY-SA Compatible License means a license listed at
creativecommons.org/compatiblelicenses, approved by Creative
Commons as essentially the equivalent of this Public License.
d. Copyright and Similar Rights means copyright and/or similar rights
closely related to copyright including, without limitation,
performance, broadcast, sound recording, and Sui Generis Database
Rights, without regard to how the rights are labeled or
categorized. For purposes of this Public License, the rights
specified in Section 2(b)(1)-(2) are not Copyright and Similar
Rights.
e. Effective Technological Measures means those measures that, in the
absence of proper authority, may not be circumvented under laws
fulfilling obligations under Article 11 of the WIPO Copyright
Treaty adopted on December 20, 1996, and/or similar international
agreements.
f. Exceptions and Limitations means fair use, fair dealing, and/or
any other exception or limitation to Copyright and Similar Rights
that applies to Your use of the Licensed Material.
g. License Elements means the license attributes listed in the name
of a Creative Commons Public License. The License Elements of this
Public License are Attribution and ShareAlike.
h. Licensed Material means the artistic or literary work, database,
or other material to which the Licensor applied this Public
License.
i. Licensed Rights means the rights granted to You subject to the
terms and conditions of this Public License, which are limited to
all Copyright and Similar Rights that apply to Your use of the
Licensed Material and that the Licensor has authority to license.
j. Licensor means the individual(s) or entity(ies) granting rights
under this Public License.
k. Share means to provide material to the public by any means or
process that requires permission under the Licensed Rights, such
as reproduction, public display, public performance, distribution,
dissemination, communication, or importation, and to make material
available to the public including in ways that members of the
public may access the material from a place and at a time
individually chosen by them.
l. Sui Generis Database Rights means rights other than copyright
resulting from Directive 96/9/EC of the European Parliament and of
the Council of 11 March 1996 on the legal protection of databases,
as amended and/or succeeded, as well as other essentially
equivalent rights anywhere in the world.
m. You means the individual or entity exercising the Licensed Rights
under this Public License. Your has a corresponding meaning.
Section 2 -- Scope.
a. License grant.
1. Subject to the terms and conditions of this Public License,
the Licensor hereby grants You a worldwide, royalty-free,
non-sublicensable, non-exclusive, irrevocable license to
exercise the Licensed Rights in the Licensed Material to:
a. reproduce and Share the Licensed Material, in whole or
in part; and
b. produce, reproduce, and Share Adapted Material.
2. Exceptions and Limitations. For the avoidance of doubt, where
Exceptions and Limitations apply to Your use, this Public
License does not apply, and You do not need to comply with
its terms and conditions.
3. Term. The term of this Public License is specified in Section
6(a).
4. Media and formats; technical modifications allowed. The
Licensor authorizes You to exercise the Licensed Rights in
all media and formats whether now known or hereafter created,
and to make technical modifications necessary to do so. The
Licensor waives and/or agrees not to assert any right or
authority to forbid You from making technical modifications
necessary to exercise the Licensed Rights, including
technical modifications necessary to circumvent Effective
Technological Measures. For purposes of this Public License,
simply making modifications authorized by this Section 2(a)
(4) never produces Adapted Material.
5. Downstream recipients.
a. Offer from the Licensor -- Licensed Material. Every
recipient of the Licensed Material automatically
receives an offer from the Licensor to exercise the
Licensed Rights under the terms and conditions of this
Public License.
b. Additional offer from the Licensor -- Adapted Material.
Every recipient of Adapted Material from You
automatically receives an offer from the Licensor to
exercise the Licensed Rights in the Adapted Material
under the conditions of the Adapter's License You apply.
c. No downstream restrictions. You may not offer or impose
any additional or different terms or conditions on, or
apply any Effective Technological Measures to, the
Licensed Material if doing so restricts exercise of the
Licensed Rights by any recipient of the Licensed
Material.
6. No endorsement. Nothing in this Public License constitutes or
may be construed as permission to assert or imply that You
are, or that Your use of the Licensed Material is, connected
with, or sponsored, endorsed, or granted official status by,
the Licensor or others designated to receive attribution as
provided in Section 3(a)(1)(A)(i).
b. Other rights.
1. Moral rights, such as the right of integrity, are not
licensed under this Public License, nor are publicity,
privacy, and/or other similar personality rights; however, to
the extent possible, the Licensor waives and/or agrees not to
assert any such rights held by the Licensor to the limited
extent necessary to allow You to exercise the Licensed
Rights, but not otherwise.
2. Patent and trademark rights are not licensed under this
Public License.
3. To the extent possible, the Licensor waives any right to
collect royalties from You for the exercise of the Licensed
Rights, whether directly or through a collecting society
under any voluntary or waivable statutory or compulsory
licensing scheme. In all other cases the Licensor expressly
reserves any right to collect such royalties.
Section 3 -- License Conditions.
Your exercise of the Licensed Rights is expressly made subject to the
following conditions.
a. Attribution.
1. If You Share the Licensed Material (including in modified
form), You must:
a. retain the following if it is supplied by the Licensor
with the Licensed Material:
i. identification of the creator(s) of the Licensed
Material and any others designated to receive
attribution, in any reasonable manner requested by
the Licensor (including by pseudonym if
designated);
ii. a copyright notice;
iii. a notice that refers to this Public License;
iv. a notice that refers to the disclaimer of
warranties;
v. a URI or hyperlink to the Licensed Material to the
extent reasonably practicable;
b. indicate if You modified the Licensed Material and
retain an indication of any previous modifications; and
c. indicate the Licensed Material is licensed under this
Public License, and include the text of, or the URI or
hyperlink to, this Public License.
2. You may satisfy the conditions in Section 3(a)(1) in any
reasonable manner based on the medium, means, and context in
which You Share the Licensed Material. For example, it may be
reasonable to satisfy the conditions by providing a URI or
hyperlink to a resource that includes the required
information.
3. If requested by the Licensor, You must remove any of the
information required by Section 3(a)(1)(A) to the extent
reasonably practicable.
b. ShareAlike.
In addition to the conditions in Section 3(a), if You Share
Adapted Material You produce, the following conditions also apply.
1. The Adapter's License You apply must be a Creative Commons
license with the same License Elements, this version or
later, or a BY-SA Compatible License.
2. You must include the text of, or the URI or hyperlink to, the
Adapter's License You apply. You may satisfy this condition
in any reasonable manner based on the medium, means, and
context in which You Share Adapted Material.
3. You may not offer or impose any additional or different terms
or conditions on, or apply any Effective Technological
Measures to, Adapted Material that restrict exercise of the
rights granted under the Adapter's License You apply.
Section 4 -- Sui Generis Database Rights.
Where the Licensed Rights include Sui Generis Database Rights that
apply to Your use of the Licensed Material:
a. for the avoidance of doubt, Section 2(a)(1) grants You the right
to extract, reuse, reproduce, and Share all or a substantial
portion of the contents of the database;
b. if You include all or a substantial portion of the database
contents in a database in which You have Sui Generis Database
Rights, then the database in which You have Sui Generis Database
Rights (but not its individual contents) is Adapted Material,
including for purposes of Section 3(b); and
c. You must comply with the conditions in Section 3(a) if You Share
all or a substantial portion of the contents of the database.
For the avoidance of doubt, this Section 4 supplements and does not
replace Your obligations under this Public License where the Licensed
Rights include other Copyright and Similar Rights.
Section 5 -- Disclaimer of Warranties and Limitation of Liability.
a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE
EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS
AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF
ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS,
IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION,
WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR
PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS,
ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT
KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT
ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU.
b. TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE
TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION,
NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT,
INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES,
COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR
USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN
ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR
DAMAGES. WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR
IN PART, THIS LIMITATION MAY NOT APPLY TO YOU.
c. The disclaimer of warranties and limitation of liability provided
above shall be interpreted in a manner that, to the extent
possible, most closely approximates an absolute disclaimer and
waiver of all liability.
Section 6 -- Term and Termination.
a. This Public License applies for the term of the Copyright and
Similar Rights licensed here. However, if You fail to comply with
this Public License, then Your rights under this Public License
terminate automatically.
b. Where Your right to use the Licensed Material has terminated under
Section 6(a), it reinstates:
1. automatically as of the date the violation is cured, provided
it is cured within 30 days of Your discovery of the
violation; or
2. upon express reinstatement by the Licensor.
For the avoidance of doubt, this Section 6(b) does not affect any
right the Licensor may have to seek remedies for Your violations
of this Public License.
c. For the avoidance of doubt, the Licensor may also offer the
Licensed Material under separate terms or conditions or stop
distributing the Licensed Material at any time; however, doing so
will not terminate this Public License.
d. Sections 1, 5, 6, 7, and 8 survive termination of this Public
License.
Section 7 -- Other Terms and Conditions.
a. The Licensor shall not be bound by any additional or different
terms or conditions communicated by You unless expressly agreed.
b. Any arrangements, understandings, or agreements regarding the
Licensed Material not stated herein are separate from and
independent of the terms and conditions of this Public License.
Section 8 -- Interpretation.
a. For the avoidance of doubt, this Public License does not, and
shall not be interpreted to, reduce, limit, restrict, or impose
conditions on any use of the Licensed Material that could lawfully
be made without permission under this Public License.
b. To the extent possible, if any provision of this Public License is
deemed unenforceable, it shall be automatically reformed to the
minimum extent necessary to make it enforceable. If the provision
cannot be reformed, it shall be severed from this Public License
without affecting the enforceability of the remaining terms and
conditions.
c. No term or condition of this Public License will be waived and no
failure to comply consented to unless expressly agreed to by the
Licensor.
d. Nothing in this Public License constitutes or may be interpreted
as a limitation upon, or waiver of, any privileges and immunities
that apply to the Licensor or You, including from the legal
processes of any jurisdiction or authority.
=======================================================================
Creative Commons is not a party to its public
licenses. Notwithstanding, Creative Commons may elect to apply one of
its public licenses to material it publishes and in those instances
will be considered the “Licensor.” The text of the Creative Commons
public licenses is dedicated to the public domain under the CC0 Public
Domain Dedication. Except for the limited purpose of indicating that
material is shared under a Creative Commons public license or as
otherwise permitted by the Creative Commons policies published at
creativecommons.org/policies, Creative Commons does not authorize the
use of the trademark "Creative Commons" or any other trademark or logo
of Creative Commons without its prior written consent including,
without limitation, in connection with any unauthorized modifications
to any of its public licenses or any other arrangements,
understandings, or agreements concerning use of licensed material. For
the avoidance of doubt, this paragraph does not form part of the
public licenses.
Creative Commons may be contacted at creativecommons.org.


@@ -1,7 +1,7 @@
-base_url = "https://maciejpedzich-preview.onrender.com"
+base_url = "https://maciejpedzi.ch"
compile_sass = true
title = "Maciej Pędzich"
-description = "Computer Science student @ PJAIT. F1 nerd since birth. Avid homelabber. House music aficionado. Visibly on the spectrum. Warsaw. he/him."
+description = "F1 nerd since birth. Homelabbing enthusiast. House music aficionado. Visibly on the spectrum. Warsaw. he/him."
theme = "tabi"
default_language = "en"
taxonomies = [{ name = "tags", feed = true }]
@@ -12,6 +12,7 @@ generate_feeds = true
highlight_code = true
bottom_footnotes = true
highlight_theme = "css"
+extra_syntaxes_and_themes = ["syntaxes"]
[extra]
skin = "custom"
@@ -33,7 +34,6 @@ socials = [
{ name = "Email", url = "mailto:contact@maciejpedzi.ch", icon = "email" },
{ name = "Gitea", url = "https://code.maciejpedzi.ch", icon = "gitea" },
{ name = "GitHub", url = "https://github.com/maciejpedzich", icon = "github" },
-{ name = "Bluesky", url = "https://bsky.app/profile/maciejpedzi.ch", icon = "bluesky" },
{ name = "LinkedIn", url = "https://www.linkedin.com/in/maciejpedzich", icon = "linkedin" },
]
footer_menu = [
@@ -43,6 +43,11 @@ footer_menu = [
{ url = "/sitemap.xml", name = "Sitemap", trailing_slash = false },
]
copyright = "© $CURRENT_YEAR Maciej Pędzich $SEPARATOR Content on this website is available under the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0) license."
+remote_repository_url = "https://code.maciejpedzi.ch/maciejpedzich/website"
+remote_repository_git_platform = "gitea"
+remote_repository_branch = "master"
+show_remote_changes = true
+show_remote_source = true
[extra.analytics]
service = "umami"


@@ -8,4 +8,4 @@ max_posts = 5
header = { title = "Welcome to my website!", img = "images/mac.png", img_alt = "Mac" }
+++
-My Polish friends call me Maciek, and international friends call me Mac. I currently live in Warsaw, where I also study Computer Science at [PJAIT](https://pja.edu.pl/en/). After hours, I'm a huge F1 nerd, an avid homelabber, as well as house music aficionado.
+My Polish friends call me Maciek, and international friends call me Mac. I'm a huge F1 nerd, homelabbing enthusiast, as well as house music aficionado. This is where I document the inner-workings of my brain and passion projects.


@@ -0,0 +1,464 @@
+++
title = "Cheesing a subnetting test with Python"
date = 2025-11-02
description = "Breaking down the Python version of a script I wrote in 2024 to blitz through subnetting tests for my computer networking uni course."
[taxonomies]
tags = ["Technical", "Python", "Networking"]
[extra]
katex = true
+++
## Background
Don't you hate it when you're supposed to perform a mundane task by hand, even though you can see an absurdly straightforward way of getting the computer to do pretty much all the dirty work for you?
That's exactly the kind of situation I found myself in last year, when our networking course tutor presented said manual approach, which made my head hurt with its sheer number of subnets and the memorisation of subnet mask octet values.
I think it was reasonable for me to approach the test exactly like a software developer would: examining the problem, devising an algorithm to solve it, and of course translating said algorithm to code. In this post I'm looking to do that once again, only this time using Python to handle the last part.
[The original script](https://github.com/maciejpedzich/subnet-solver) was written in Rust, since I just so happened to be learning that language back then, and I thought it would be a neat little exercise to sharpen up my skills. You'll soon find out why porting it to Python turned out to be so beneficial and worthy of an article.
## Outlining the test formula
We're given a random IPv4 address along with the number of bits reserved for the host part of the base network - in other words, a random [CIDR (**C**lassless **I**nter-**D**omain **R**outing) notation](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing#CIDR_notation). Then we have a comma-separated list of subnets to divide the base network into. Each entry is represented by the subnet's unique name (single uppercase letter), followed by a comma and the number of hosts that need to be connected to that network, with all these fields wrapped in parentheses.
Our first task is to determine the base network's address, subnet mask, broadcast address, and address pool size. Our second task is to divide the network such that the subnet with the largest pool size receives a chunk of the lowest addresses from the base pool, the second largest subnet gets a chunk that starts with the first address outside the previous pool, and so on until we've gone through all the entries.
If two subnets happen to have an identical address pool size, the tie is broken by falling back to alphabetical (or reverse alphabetical depending on the test version) order of their names.
## Explaining the algorithm by example
Suppose we have:
- a base CIDR of `2.137.69.4/20`
- a subnet list of `(P,510), (D,256), (H,2025), (W,873)`
- alphabetical tiebreak order
### Task 1
How do we go about doing the aforementioned tasks? The first one is insanely easy to cheese with Python, but I'm going to explain it step by step anyway, so that you get a better understanding of where the end result came from.
#### Address pool size
Let's tackle the address pool size first. Because the number on the right-hand side of the slash denotes the number of high-order (leftmost) bits that remain unchanged for each address in the pool, we can subtract it from 32 (an IPv4 address's bit width) to obtain the number of bits that do change - in other words, the bits that identify a host on the network.
We can then raise 2 to the power of that difference to find out how many unique addresses we can assign using the number of bits specified in the exponent, which is precisely the address pool size we're looking for.
In our example case, the answer is: $$2^{32-20}=2^{12}=4096$$
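As a quick illustration (using the example's `/20` prefix), this is a one-liner in Python thanks to its built-in integer exponentiation:

```python
# Host bits = IPv4 width (32) minus the prefix length from the CIDR.
prefix_len = 20
pool_size = 2 ** (32 - prefix_len)
print(pool_size)  # 4096
```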
#### Subnet mask
As I mentioned in the previous sub-subsection, the number to the right of the slash tells us how many high-order bits remain unchanged across all the addresses. The subnet mask is a number that has exactly that many of those bits set to 1 in its binary representation, with all the remaining ones set to 0.
In order to obtain an integer that will only have $n$ leftmost bits active, we can do some bit-shifting magic. The trick is to shift 1 by $n$ bits to the left and subtract 1 from the result to receive a number that has exactly $n$ least significant bits set to 1, and then shift that $32 - n$ bits to the left to move the active bits to the most significant positions.
The formula looks like this: $$((1\ll{n})-1)\ll(32-n)$$
For $n=20$, it evaluates to $4,294,963,200$ (or $-4096$ as a signed integer), which in binary representation (grouped into octets) is: $$11111111.11111111.11110000.00000000$$
Looks good to me! Converting the octets back to decimal will give us the answer: $$255.255.240.0$$
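Here's a sketch of that bit-shifting trick in Python, including the octet split for dotted-decimal display:

```python
prefix_len = 20

# ((1 << n) - 1) sets the n lowest bits; shifting the result by
# (32 - n) moves them up to the most significant positions.
mask = ((1 << prefix_len) - 1) << (32 - prefix_len)

# Break the 32-bit mask into four octets for dotted-decimal notation.
octets = [(mask >> shift) & 0xFF for shift in (24, 16, 8, 0)]
print(".".join(map(str, octets)))  # 255.255.240.0
```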
#### Network address
Now we can find out what the network address is by performing a bitwise AND operation between the address from the CIDR and the subnet mask we've just determined. It will select the bits reserved for the network part of the address and clear all the host bits.
$$\qquad\enspace\space00000010.10001001.01000101.00000100\newline\text{AND}\enspace11111111.11111111.11110000.00000000\newline\overset{\rule[2.25pt]{198pt}{0.1pt}}{\newline\qquad\enspace\space00000010.10001001.01000000.00000000}$$
The answer in decimal is: $$2.137.64.0$$
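In Python, this boils down to parsing the dotted-decimal address into a single integer and applying `&`. The helper names below are mine, not the original script's:

```python
def ip_to_int(ip: str) -> int:
    # Fold the four octets into one 32-bit integer.
    value = 0
    for octet in ip.split("."):
        value = (value << 8) | int(octet)
    return value

def int_to_ip(value: int) -> str:
    # Split the integer back into dotted-decimal notation.
    return ".".join(str((value >> shift) & 0xFF) for shift in (24, 16, 8, 0))

mask = ((1 << 20) - 1) << 12  # 255.255.240.0
network = ip_to_int("2.137.69.4") & mask
print(int_to_ip(network))  # 2.137.64.0
```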
#### Broadcast address
For the final piece of the task 1 puzzle, we have to perform a bitwise OR operation between the network address we've just learned and the inverted subnet mask. This will activate all the previously cleared host bits, giving us the highest possible address in the process.
$$\qquad\enspace\space00000010.10001001.01000000.00000000\newline\text{OR}\quad\space00000000.00000000.00001111.11111111\newline\overset{\rule[2.25pt]{198pt}{0.1pt}}{\newline\qquad\enspace\space00000010.10001001.01001111.11111111}$$
The answer in decimal is: $$2.137.79.255$$
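The same integer representation makes this a single `|` with the inverted mask. One sketch-worthy gotcha: Python's `~` on an unbounded int yields a negative number, so the inversion has to be masked back down to 32 bits:

```python
mask = ((1 << 20) - 1) << 12                   # 255.255.240.0
network = (2 << 24) | (137 << 16) | (64 << 8)  # 2.137.64.0 as an integer

# Python ints are unbounded, so clamp the inverted mask to 32 bits.
broadcast = network | (~mask & 0xFFFFFFFF)

octets = [(broadcast >> shift) & 0xFF for shift in (24, 16, 8, 0)]
print(".".join(map(str, octets)))  # 2.137.79.255
```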
### Task 2
Alright, we've got everything we need to know about the base network, so we can move on to dividing it into smaller networks accordingly with the provided list. Just to remind you, the list is `(P,510), (D,256), (H,2025), (W,873)` and the tiebreak ordering by names is alphabetical.
#### Calculating subnets' address pool sizes
We can't do any sorting straightaway, because we first have to find out what each subnet's address pool size is. You may recall from the [address pool size section](#address-pool-size) that we raised 2 to the power of the number of bits reserved for identifying hosts in our base network. This exponent is the key to our solution, but how can we get hold of it?
Ideally, we'd want to somehow do the inverse of exponentiation, where we'd plug the number of addresses we need (all the hosts plus the network and broadcast addresses) into this mysterious function, which would return the lowest integer exponent that generates a power of 2 greater than or equal to the input. The output has to be an integer, because we can't use a fraction of a bit - it's an atomic unit.
Luckily for us, some clever people came up with two handy mathematical tools called [logarithm](https://en.wikipedia.org/wiki/Logarithm) and [ceiling](https://en.wikipedia.org/wiki/Floor_and_ceiling_functions) functions. The former is responsible for doing the inverse exponentiation, whereas the latter rounds up the real result to the closest integer or leaves it alone if it's already an integer.
Putting it all together - given $n$ hosts, the minimum required size of a subnet's address pool can be determined using this formula: $$\large{2^{\lceil{\log_2(n+2)}\rceil}}$$
Applying it to our example subnets gives us the following:
- **P**: $2^{\lceil{\log_2(512)}\rceil}=2^9=512$
- **D**: $2^{\lceil{\log_2(258)}\rceil}=2^9=512$
- **H**: $2^{\lceil{\log_2(2027)}\rceil}=2^{11}=2048$
- **W**: $2^{\lceil{\log_2(875)}\rceil}=2^{10}=1024$
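With `math.log2` and `math.ceil`, the formula translates almost verbatim. A sketch using the example's host counts:

```python
import math

hosts = {"P": 510, "D": 256, "H": 2025, "W": 873}

# Add 2 for the network and broadcast addresses, then round the
# exponent up to the nearest whole bit.
pool_sizes = {
    name: 2 ** math.ceil(math.log2(count + 2))
    for name, count in hosts.items()
}
print(pool_sizes)  # {'P': 512, 'D': 512, 'H': 2048, 'W': 1024}
```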
#### Sorting entries and subnetting base network
We can finally order the subnets, assign them appropriate address chunks, and learn their properties (i.e. network address, subnet mask, and broadcast address). It's clear that H and W will come out on top by their respective sizes alone, with P and D being tied at 512 addresses each. Since we're meant to break ties using the alphabetical ordering of names, D will be placed before P.
From here we can follow very similar steps to those we took with the base network to derive the aforementioned properties. H's network address will be the same as the base one's. The subnet mask can be obtained by subtracting the pool size's exponent of 2 from 32 and plugging the result into the same formula I showed in [this section](#subnet-mask). Similarly, the [broadcast address](#broadcast-address) requires the same bitwise OR operation between the network address and the inverted subnet mask.
The next subnet's network address is essentially the previous one's broadcast address incremented by 1, and I'm pretty sure you get the idea from here, so let's skip ahead to the final result:
{% wide_container() %}
| Name | Pool size | Network address | Subnet mask | Broadcast address |
| ---- | --------- | --------------- | ------------- | ----------------- |
| H | 2048 | 2.137.64.0 | 255.255.248.0 | 2.137.71.255 |
| W | 1024 | 2.137.72.0 | 255.255.252.0 | 2.137.75.255 |
| D | 512 | 2.137.76.0 | 255.255.254.0 | 2.137.77.255 |
| P | 512 | 2.137.78.0 | 255.255.254.0 | 2.137.79.255 |
{% end %}
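If you'd like to double-check a row by hand, the H entry can be spelled out with nothing but integer bitwise operations (a standalone sketch; the actual script below will delegate all of this to Python's `ipaddress` module):

```python
# H needs 2048 = 2**11 addresses, so its prefix length is 32 - 11 = 21
prefixlen = 32 - 11
mask = (0xFFFFFFFF << (32 - prefixlen)) & 0xFFFFFFFF

def to_dotted(value: int) -> str:
    # Split a 32-bit integer into four dot-separated octets
    return ".".join(str((value >> shift) & 0xFF) for shift in (24, 16, 8, 0))

network = (2 << 24) | (137 << 16) | (64 << 8)  # 2.137.64.0
broadcast = network | (~mask & 0xFFFFFFFF)     # set all the host bits

print(to_dotted(mask))       # 255.255.248.0
print(to_dotted(broadcast))  # 2.137.71.255
```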
## Implementing the algorithm
That's it for the theoretical part of this article. We can, at long last, write some code!
### Accepting and validating arguments
Let's say we want to pass the test parameters as command-line arguments in order:
1. base subnet's CIDR
2. comma-separated subnet list
3. tiebreak ordering (`"A->Z"` for alphabetical and `"Z->A"` for reverse alphabetical, because that's how they were denoted on my test).
We have to ensure that:
- we provide exactly 3 parameters
- the list is written as specified in the [test formula specification](#outlining-the-test-formula)
- the tiebreak ordering marker is one of the aforementioned strings
If any of the above checks fails, we should display an appropriate error message and exit the script with a failure status code (I'll keep using 1, but you can use different non-zero status codes if you're that keen on specifying the cause of failure).
Here's how we can code up this validation mechanism:
```python
import re
import sys

if len(sys.argv) != 4:
    print(
        "You have to provide exactly 3 arguments (in order):",
        "",
        "1. Base subnet's CIDR, eg. 123.45.67.89/10",
        '2. Comma-separated list of subnets with their name character and minimum number of hosts, eg. "(A,12), (B,34), (C,56)"',
        '3. "A->Z" to order subnets with the same pool sizes alphabetically, or "Z->A" to use reverse alphabetical order',
        sep="\n",
        file=sys.stderr
    )
    exit(1)
elif not re.fullmatch(r"^(?:\([A-Z],\d+\)(?:,\s*|$))+", sys.argv[2]):
    print("Invalid subnet list format!", file=sys.stderr)
    exit(1)
elif sys.argv[3] not in {"A->Z", "Z->A"}:
    print("Invalid order marker!", file=sys.stderr)
    exit(1)
```
Although checking the size of the list of arguments against 4 may seem like a typo at first glance, you have to keep in mind that the first string is the name of the script being executed, so _the actual parameters_ go from index 1 onwards.
As for the regex in the second if branch, it's not as complex or scary as it looks. Let's break it down:
- `^` (caret) matches the start of the string
- `(?:\([A-Z],\d+\)(?:,\s*|$))` marks a non-capturing group, where we match:
- `\([A-Z],\d+\)` an opening bracket, an uppercase letter from A to Z, a comma, at least one digit, and a closing bracket
- `(?:,\s*|$)` another non-capturing group with an alternative between:
- `,\s*` a comma followed by zero or more whitespace characters
- `$` the end of the string
- `+` matches at least one occurrence of the whole preceding group
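Here's how the pattern behaves on a few sample inputs (note that `re.fullmatch` returns `None` unless the entire string matches):

```python
import re

pattern = r"^(?:\([A-Z],\d+\)(?:,\s*|$))+"

print(bool(re.fullmatch(pattern, "(A,12), (B,34), (C,56)")))  # True
print(bool(re.fullmatch(pattern, "(a,12)")))                  # False - lowercase name
print(bool(re.fullmatch(pattern, "(A,12)(B,34)")))            # False - missing comma
```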
> _But what about validating the CIDR notation?_
I hear you ask. Remember how I mentioned the cheesing potential of Python that makes this version of the script feel so overpowered? Check this out:
```python
from ipaddress import (
    AddressValueError,
    IPV4LENGTH,
    IPv4Network,
    NetmaskValueError
)

# ...

try:
    base_network = IPv4Network(sys.argv[1], strict=False)
except AddressValueError:
    print("Invalid base IPv4 address!", file=sys.stderr)
    exit(1)
except NetmaskValueError:
    print("Invalid base subnet mask bit count!", file=sys.stderr)
    exit(1)
```
While this `try/except` block may not look out of the ordinary, I'd like to draw your attention to the `strict=False` keyword argument. It tells the `IPv4Network` constructor to extract the network address from a CIDR that may contain the address of a host within the subnet, instead of raising a `ValueError`. And of course, we also get the subnet mask and broadcast address calculated for us.
So, was bringing up all the bitwise shenanigans in vain? I don't think so, because you should now be able to port the script to a language that doesn't offer the convenience of a similar class.
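To see `strict=False` in action on our test's base CIDR:

```python
from ipaddress import IPv4Network

# 2.137.69.4 is a host address within 2.137.64.0/20, not the network address,
# so the default strict=True would raise a ValueError here
net = IPv4Network("2.137.69.4/20", strict=False)

print(net.network_address)    # 2.137.64.0
print(net.netmask)            # 255.255.240.0
print(net.broadcast_address)  # 2.137.79.255
```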
> _OK, but what if there aren't enough addresses in the base pool for all the subnets?_
Good question, we'll come back to it shortly.
### Extracting and sorting subnet entries
Next up, we'll go through and sort all the entries provided in the second parameter. Here's how I've tackled the implementation:
```python
from math import ceil, log2

# ...

subnet_entries = sorted(
    map(
        lambda match: (
            match.group(1),
            ceil(log2(int(match.group(2)) + 2))
        ),
        re.finditer(r"\(([A-Z]),(\d+)\)", sys.argv[2])
    ),
    key=lambda entry: (
        -entry[1],
        # Names are single characters, so we can sort on the code point
        # (negated for reverse alphabetical order)
        ord(entry[0]) * (1 if sys.argv[3] == "A->Z" else -1)
    )
)
```
We kick things off by extracting all the entries into an iterator using a portion of our previous regular expression, except this time we capture both the subnet's name and its number of hosts for easier access to each field's raw value.
We can get away with calculating just the exponent without raising 2 to that power and sorting by it alone, because every pool size shares the same base. The sorting key negates that exponent so that bigger subnets come first, then breaks ties by the subnet's name, honouring the requested alphabetical order.
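To illustrate the ordering on our example entries (names paired with their already-computed exponents):

```python
entries = [("P", 9), ("D", 9), ("H", 11), ("W", 10)]

# Biggest pools first, ties broken alphabetically (the "A->Z" case)
print(sorted(entries, key=lambda entry: (-entry[1], entry[0])))
# [('H', 11), ('W', 10), ('D', 9), ('P', 9)]
```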
### Creating actual subnets for each entry
Before we get down to the subnetting business, we'll have to come back to the deferred question about validating the base pool size against the total of the subnets' pool sizes. For the former we can leverage the [formula I introduced earlier](#address-pool-size), whereas for the latter we can sum 2 raised to each of the recently calculated exponents.
```python
base_addr_pool_size = 2 ** (IPV4LENGTH - base_network.prefixlen)
total_subnets_pool_size = sum(
    map(
        lambda entry: 2 ** entry[1],
        subnet_entries
    )
)

if total_subnets_pool_size > base_addr_pool_size:
    print(
        "The total size of provided subnets exceeds that of the base address pool!",
        file=sys.stderr
    )
    exit(1)
```
Alright, it's subnetting time:
```python
table_rows: list[tuple[str, int, str, str, str]] = []

for name, suffixlen in subnet_entries:
    prefixlen = IPV4LENGTH - suffixlen
    subnet = IPv4Network(
        (base_network.network_address, prefixlen)
    )
    table_rows.append(
        (
            name,
            2 ** suffixlen,
            str(subnet.network_address),
            str(subnet.netmask),
            str(subnet.broadcast_address)
        )
    )
    base_network = IPv4Network(
        int(subnet.broadcast_address) + 1
    )
```
Once again, no trace of bitwise black magic thanks to the power of the `IPv4Network` class. It's also worth pointing out that it's perfectly acceptable to pass an address to its constructor without specifying the subnet mask's bit count, like in that `base_network` reassignment. In such a case, the constructor implicitly sets it to 32.
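For instance, stepping past H's broadcast address to get the next subnet's starting point:

```python
from ipaddress import IPv4Network

h_subnet = IPv4Network("2.137.64.0/21")

# The integer form of the broadcast address plus 1 becomes
# the next network's address, implicitly treated as a /32
next_network = IPv4Network(int(h_subnet.broadcast_address) + 1)

print(next_network)  # 2.137.72.0/32
```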
### Pretty-printing results
And last but not least, we need to output the results of our little script. Let's start by printing the key details about our base network between the pool size check and the `table_rows` declaration:
```python
print(
    f"Base network address: {base_network.network_address}",
    f"Base subnet mask: {base_network.netmask}",
    f"Base broadcast address: {base_network.broadcast_address}",
    "",
    f"Base address pool size: {base_addr_pool_size:,}",
    f"Total subnets' pool size: {total_subnets_pool_size:,}",
    "",
    sep="\n"
)
```
To top it all off, let's print a neat ASCII table after looping through `subnet_entries`:
```python
row_separator = f"+{'-'*6}+{'-'*15}+{'-'*17}+{'-'*17}+{'-'*19}+"

print(
    row_separator,
    "| Name | Pool size     | Network address | Subnet mask     | Broadcast address |",
    "=" * len(row_separator),
    sep="\n"
)

for row in table_rows:
    print(
        f"|{row[0]:<6}|{row[1]:<15,}|{row[2]:<17}|{row[3]:<17}|{row[4]:<19}|",
        row_separator,
        sep="\n"
    )
```
Moment of truth:
```
> python subnets.py 2.137.69.4/20 "(P,510), (D,256), (H,2025), (W,873)" "A->Z"
Base network address: 2.137.64.0
Base subnet mask: 255.255.240.0
Base broadcast address: 2.137.79.255

Base address pool size: 4,096
Total subnets' pool size: 4,096

+------+---------------+-----------------+-----------------+-------------------+
| Name | Pool size     | Network address | Subnet mask     | Broadcast address |
================================================================================
|H     |2,048          |2.137.64.0       |255.255.248.0    |2.137.71.255       |
+------+---------------+-----------------+-----------------+-------------------+
|W     |1,024          |2.137.72.0       |255.255.252.0    |2.137.75.255       |
+------+---------------+-----------------+-----------------+-------------------+
|D     |512            |2.137.76.0       |255.255.254.0    |2.137.77.255       |
+------+---------------+-----------------+-----------------+-------------------+
|P     |512            |2.137.78.0       |255.255.254.0    |2.137.79.255       |
+------+---------------+-----------------+-----------------+-------------------+
```
Yay, we've done it!
## Complete source code
```python,linenos,name=subnets.py
# (c) 2025 Maciej Pędzich
# Released under the CC BY-SA 4.0 license:
# https://creativecommons.org/licenses/by-sa/4.0

from ipaddress import (
    AddressValueError,
    IPV4LENGTH,
    IPv4Network,
    NetmaskValueError
)
from math import ceil, log2
import re
import sys

if len(sys.argv) != 4:
    print(
        "You have to provide exactly 3 arguments (in order):",
        "",
        "1. Base subnet's CIDR, eg. 123.45.67.89/10",
        '2. Comma-separated list of subnets with their name character and minimum number of hosts, eg. "(A,12), (B,34), (C,56)"',
        '3. "A->Z" to order subnets with the same pool sizes alphabetically, or "Z->A" to use reverse alphabetical order',
        sep="\n",
        file=sys.stderr
    )
    exit(1)
elif not re.fullmatch(r"^(?:\([A-Z],\d+\)(?:,\s*|$))+", sys.argv[2]):
    print("Invalid subnet list format!", file=sys.stderr)
    exit(1)
elif sys.argv[3] not in {"A->Z", "Z->A"}:
    print("Invalid order marker!", file=sys.stderr)
    exit(1)

try:
    base_network = IPv4Network(sys.argv[1], strict=False)
except AddressValueError:
    print("Invalid base IPv4 address!", file=sys.stderr)
    exit(1)
except NetmaskValueError:
    print("Invalid base subnet mask bit count!", file=sys.stderr)
    exit(1)

subnet_entries = sorted(
    map(
        lambda match: (
            match.group(1),
            ceil(log2(int(match.group(2)) + 2))
        ),
        re.finditer(r"\(([A-Z]),(\d+)\)", sys.argv[2])
    ),
    key=lambda entry: (
        -entry[1],
        # Names are single characters, so we can sort on the code point
        # (negated for reverse alphabetical order)
        ord(entry[0]) * (1 if sys.argv[3] == "A->Z" else -1)
    )
)

base_addr_pool_size = 2 ** (IPV4LENGTH - base_network.prefixlen)
total_subnets_pool_size = sum(
    map(
        lambda entry: 2 ** entry[1],
        subnet_entries
    )
)

if total_subnets_pool_size > base_addr_pool_size:
    print(
        "The total size of provided subnets exceeds that of the base address pool!",
        file=sys.stderr
    )
    exit(1)

print(
    f"Base network address: {base_network.network_address}",
    f"Base subnet mask: {base_network.netmask}",
    f"Base broadcast address: {base_network.broadcast_address}",
    "",
    f"Base address pool size: {base_addr_pool_size:,}",
    f"Total subnets' pool size: {total_subnets_pool_size:,}",
    "",
    sep="\n"
)

table_rows: list[tuple[str, int, str, str, str]] = []

for name, suffixlen in subnet_entries:
    prefixlen = IPV4LENGTH - suffixlen
    subnet = IPv4Network(
        (base_network.network_address, prefixlen)
    )
    table_rows.append(
        (
            name,
            2 ** suffixlen,
            str(subnet.network_address),
            str(subnet.netmask),
            str(subnet.broadcast_address)
        )
    )
    base_network = IPv4Network(
        int(subnet.broadcast_address) + 1
    )

row_separator = f"+{'-'*6}+{'-'*15}+{'-'*17}+{'-'*17}+{'-'*19}+"

print(
    row_separator,
    "| Name | Pool size     | Network address | Subnet mask     | Broadcast address |",
    "=" * len(row_separator),
    sep="\n"
)

for row in table_rows:
    print(
        f"|{row[0]:<6}|{row[1]:<15,}|{row[2]:<17}|{row[3]:<17}|{row[4]:<19}|",
        row_separator,
        sep="\n"
    )
```
@@ -11,7 +11,7 @@ tags = ["Personal"]
I've done it!
Here's my first post of 2025 - better late than never, I guess. Although the first quarter was actually quite decent, the second quarter was anything but. Yet as we're approaching Q4, I'm somewhat grateful for Q2 turning out the way it did. Q2 kicked off a series of events that ultimately helped me understand myself better and improve my mood in the long run.
But let's rewind to January.
@@ -33,7 +33,7 @@ The answer is... _drumroll_
### April
Mood swings and consistent lack of motivation to do anything productive apart from uni assignments. Seemingly overnight, I would frequently find myself just collapsing on my bed shortly after coming back home, often accompanied by strong headaches. I wasn't struggling academically, the workload wasn't overwhelming, but I just felt so fucking done with all that bullshit.
{{ admonition(type="note", text="While I generally refrain from cursing in blog posts, the last sentence seems fitting as an illustration of my overall mood at that point in time.") }}
@@ -75,11 +75,11 @@ The swing continues to move up and down, but increasingly more violently.
#### Test, test, test
Another recommendation from the doctor was doing a blood test, because my diet had been _very_ restrictive since... well, my very early childhood. To this day, it consists exclusively of processed food, because I can't stand the appearance and smell of pretty much all the _normal_ dishes out there. Despite that, I'm actually a little underweight, because my appetite is quite low most of the time.
But anyway, the results indicated plenty of vitamin deficiencies. Shocking, I know.
I also took some autism screening tests to further validate my suspicions and potentially bring up the results to the doctor doing my proper diagnosis. Here's a list of the assessments I completed, along with my scores. By the way, shout-out to [Embrace Autism](https://embrace-autism.com) for the amazing breakdowns of these tests and automated scoring tools.
- [**Autism Spectrum Quotient (AQ)**](https://embrace-autism.com/autism-spectrum-quotient)
- Score: 45
@@ -102,7 +102,7 @@ Every single score indicated a great likelihood of me being on the spectrum, whi
#### A new low
Unfortunately, I started having those internal visions (not hallucinations) of me throwing myself off a bridge into the Vistula river. I would also experience heightened anxiety in public places, unusually low appetite, difficulties falling asleep, as well as a general sense of being a waste of space. I stopped tinkering with my homelab, I left pretty much every Discord I was active in, I deleted some social media accounts... I was gradually withdrawing from life.
At that point I hadn't made any concrete arrangements or resorted to self-harm. I'm quite anxious when handling needles or sharp blades, and while we're not talking outright fear of these objects, I'm definitely afraid of accidentally cutting myself, let alone doing that deliberately.
@@ -110,7 +110,7 @@ Anyhow, this was no longer burnout. This was depression.
#### The second best time is now
In May, I foolishly declined getting prescription meds, still trying to believe it was all just a temporary low. But this time around I had no choice but to start taking antidepressants. I also got some antianxiety pills, sleeping pills, and vitamin supplements. Of course, they wouldn't start working overnight, and I was still feeling directionless with regards to my future.
It was a step forward, though.
@@ -163,7 +163,7 @@ The strangest thing about these arrangements, is that there was no single major
The 17th of July 2025 marked the 10th anniversary of [Jules Bianchi](https://en.wikipedia.org/wiki/Jules_Bianchi)'s death, as well as the 30th anniversary of [Juan Manuel Fangio](https://en.wikipedia.org/wiki/Juan_Manuel_Fangio)'s death. If that's not unfortunate enough, this date comes [13](https://en.wikipedia.org/wiki/13_(number)#Bad) days after the 4th of July.
{% end %}
Anyway, I set the date to the 1st of October 2025. Before heading out to the Warsaw Central train station, I wanted to send personalised goodbye videos to 10 of my closest friends from various circles between 7 AM and 10 AM depending on when I'd wake up. These videos would feature me explaining why they earned the questionable honour and why I was peacing out for good, and obviously thanking them for everything.
What would follow is a walk from my flat to platform no. 3 or 4 of the train station. From there on, it would be a matter of waiting for the first train that would just so happen to be passing by, before I'd dash to the tracks and jump. Thankfully, this is as fleshed-out as these plans ever got.
@@ -195,7 +195,7 @@ My request to switch modes got accepted by the dean, so I'm really looking forwa
## Q4 2025 and beyond - expressing myself more
I hope to allocate more time towards working on my personal projects and this blog. The former will see a mix of me experimenting more with Python and the datavis side of things, as well as revisiting some of my already released apps that I believe deserve more love.
The latter will see an introduction of a monthly series that's going to run at least until January 2026: mood report with both a statistical and IRL summary of the last 4 weeks. I'll also include a technical writeup on the mood spreadsheet itself, because it got me to write some rather interesting formulas as a fun little challenge.
@@ -0,0 +1,54 @@
+++
title = "How I (hardly ever) use AI"
date = 2025-10-22
description = "Outlining my AI usage philosophy by answering some questions for my friend."
[taxonomies]
tags = ["Random", "AI"]
+++
A couple days ago, [my friend Ryan](https://ryantrimble.com) asked fellow developers [on Bluesky](https://bsky.app/profile/ryantrimble.com/post/3m3n2vopg322g) about their approach to using AI-powered tools and LLMs for work. I thought I'd use that opportunity to prepare my personal AI statement of sorts, especially since Ryan's given me a complete list of questions for me to reply to in full.
## Are you dabbling with it, or are you making full use of it in your work?
No, I don't really dabble with (let alone utilise) it on a regular basis. There is one very specific use-case I can think of, where AI and LLMs can indeed prove quite useful, but I'll elaborate on that in a later section.
## Is your work part of a collaborative team? Any impact there?
Practically every single line of code I've written so far is a part of a solo project of mine, therefore I can't say I've been impacted by AI's output there.
## Has it inspired you to build bigger and better things than you may have found difficult or impossible without it?
Not at all, because I've got enough confidence in my own set of technical skills to build virtually any project myself. Should I need to expand that set, I've got a wide range of significantly more reliable resources at my disposal - official documentation, books, videos, courses, forums, you name it.
## How does it feel to give up control to a virtual agent? Does it require a different mindset or mental model than when you're handwriting the code yourself?
All the aforementioned confidence goes out of the window straightaway. I feel like I constantly need to double-check the output and point out all the mistakes I've found or just rewrite the whole prompt, which gets more tedious as the complexity of a given task increases.
I reckon it's more productive to tackle the problem head on with the power of some of the resources I listed earlier and maybe a debugger on your side. That way not only do you manage to solve it, but also gain a proper understanding of its root cause.
## What's been good about it? What benefits have you gained from it?
Here's where I have to give LLMs some credit. I believe they do a rather good job when it comes to performing simple transformations over numerous lines of text that would quickly prove cumbersome when done manually.
I remember having to convert a [Rust enum](https://doc.rust-lang.org/rust-by-example/custom_types/enum.html) to a module of [constants](https://doc.rust-lang.org/rust-by-example/custom_types/constants.html), because some of the new entries I needed to introduce had their values conflict with existing ones, but the enum in question contained well over a hundred options.
This is where I've opted to delegate the conversion to an LLM, since doing it by hand would be much more time-consuming and typo-prone. As much as their generative capabilities are way too hit-or-miss for my liking, I've found their transformative skills reliable enough for me to apply them on large amounts of text.
## What's been difficult about it? What drawbacks have you experienced? Do those drawbacks outweigh the benefits?
I think I've already answered the first two questions, so I'll focus on the last one. The fact that an LLM's output is only designed to pass off as written by a human without providing any guarantees of its factual correctness makes it a huge deal-breaker for me.
## What's your overall take? Does it help you enjoy your work more, or less? Does it make you more or less productive?
Using an LLM to build a project or solve a programming problem for me feels dishonest. It takes away all the fun from doing these two things, which is completely missing the point.
And like I said earlier, having to proofread the reply to my prompt, editing the prompt to provide feedback, and so on until eventually landing at a decent enough solution... that seems quite counterproductive to me.
## Does it change the way you view yourself as a developer?
Some claim that LLMs are bound to replace us, but judging by all the vibe-coding horror stories I've heard so far, I might as well join the ever-growing market of _cleanup specialists_, where my prescription would always be tearing down the sorry mess of an app/website/game/whatever and writing one from scratch.
## How do you plan to use it going forward?
I'll stick with employing LLMs to do the dirty work of text transformation if necessary. I can handle writing actual code/articles and problem-solving just fine.
@@ -0,0 +1,91 @@
+++
title = "Looking back on 2025 and ahead to 2026"
date = 2026-01-02
description = "(TW: suicide) Reflecting on the most challenging year for me so far and setting goals for the upcoming one."
[taxonomies]
tags = ["Personal"]
+++
{{ admonition(type="warning", title="TRIGGER WARNING", text="This post contains mentions of suicide.") }}
If you'd told me [six months ago](/blog/hello-again-world/#the-bad-and-ugly) that I'd actually make it to 2026, I probably wouldn't have believed you. I hit rock bottom mentally and was genuinely preparing to quit altogether. But then my prescription meds finally started working their magic, and when I published that blog post in late September, I thought it was all behind me.
## Final quarter of 2025
Spoiler alert: it wasn't.
### Pulling the trigger... on uni
When the new academic year kicked off in October, the burnout came back in full swing. When I made that decision to switch from learning onsite to online in August, I felt like I had no other choice if I wanted to stay on my own in Warsaw. Even though I wanted to drop out back then, I didn't feel like I had a convincing plan B that would satisfy my parents (I was and still am financially dependent on them).
Yet with each day I tried to force myself to do any of the course assignments, the burnout just kept getting stronger. Soon enough I ended up back at square one, reconsidering suicide and watching all the progress I'd made over the past few months seemingly evaporate.
I was effectively faced with two scenarios:
1. Desperately attempt to finish the final 3 semesters and either burn myself out irreversibly or follow through with the one-way trip to the train station
2. Drop out and come up with a strategy to become financially independent, but still have my parents bankroll me in the interim
As you may have guessed, I went with the second option. So, what was the outcome?
I got my parents' blessing to stay in Warsaw and adopt the aforementioned strategy! I'll outline it in a later section, but now let's look at two more highlights from the end of 2025.
### Docker + Packwiz Minecraft server template
In October I launched a Minecraft server (Java Edition, version 1.19.2) for my friends on a Discord server I'm active in. It used the [Raspberry Flavoured 3.1 modpack](https://www.curseforge.com/minecraft/modpacks/raspberry-flavoured/files/7026734) as its base along with a bunch of other quality-of-life and Discord integration mods.
To streamline the management of all the mods and server configuration changes, I made a Docker Compose project with said Minecraft server and a private [Packwiz](https://packwiz.infra.link/) registry that the former reaches out to in order to install and/or update mods on startup.
In typical developer fashion, I kept track of all changes using a Git repository, which also comes with a pre-commit and a post-commit hook that update the registry's file hashes and make sure none of the Docker-ignored files accidentally end up in the registry's HTTP server.
I liked this setup so much that I later created a generic project template with a Bash setup script that lets you initialise the Packwiz registry and creates the Git hooks in the repo's `.git/hooks` directory. You can find the source code on [my Gitea instance](https://code.maciejpedzi.ch/maciejpedzich/mc-server-packwiz-docker) or on [GitHub](https://github.com/maciejpedzich/mc-server-packwiz-docker).
### Musical Secret Santa
In September, I started this monthly _Musical Secret Santa_ on the same Discord server. The premise is very simple: you're assigned a random person, for whom you have to create a playlist or a mixtape following a theme given by that person.
As more people joined the fun in subsequent editions, it became increasingly difficult to keep track of everyone's requests, preferred streaming platforms, as well as previous draws to avoid back-to-back repeats.
To address all these issues and effectively keep track of all that data in one place, I launched a [dedicated website](https://mss.cuppa.town) in December. It's proudly self-hosted and powered by TypeScript, Express, and SQLite (among others).
At the time of writing, I haven't got round to preparing a README for this project yet, but in case you want to host it yourself already, you can examine the [source code](https://code.maciejpedzi.ch/maciejpedzich/secret-santa) on my Gitea instance, especially `src/env.d.ts` for all the required environment variables.
## Resolutions for 2026
I have two main goals for the new year:
1. Land a DevOps/SysAdmin job
2. Work on a DJing side project
Here's what I'm going to do to achieve them.
### Land a DevOps/SysAdmin job
Given the current job market landscape and the fact that I've given up on pursuing a CS degree, this goal will definitely take a lot of effort from my end, but it's far from impossible.
First of all, I need to do a detailed tour of the homelab I've been running for a year and a half at this point. It has seen plenty of structural changes since its introduction and the last post I did about it back in 2024, so it's high time I revisited it and greatly expanded on every single area to showcase my pretty decent on-premise infrastructure experience. The deadline for this subtarget is March.
Second of all, I believe I have to start sharing my articles more often on LinkedIn. As much as I despise this hellsite, it's a necessary evil with my current (un)employment situation. And last but not least, I'm looking to obtain two certificates:
1. [AWS Certified Cloud Practitioner](https://aws.amazon.com/certification/certified-cloud-practitioner) by May
2. [AWS Certified Solutions Architect](https://aws.amazon.com/certification/certified-solutions-architect-associate) or [AWS Certified Developer Associate](https://aws.amazon.com/certification/certified-developer-associate) by September
In order to prepare for these certificates as best I can, I'll use AWS's e-learning resources and mock exams, based on which I'll prepare a practice routine for myself to follow ahead of taking the real deal.
With all these pieces in place, I should be ready to start applying for various job openings. I'm targeting hybrid openings if a given company has an office in Warsaw, and fully remote otherwise. I hope I'll be able to find something by the end of this year, but I'm not concerned about my financial situation as long as my parents keep sponsoring me.
### Work on a DJing side project
Now, I have a confession. Working in IT wasn't actually my first career choice - it was being an electronic music DJ and producer. However, I've always feared the rejection of this idea from my parents, especially given the higher barrier of entry compared to programming.
For the record (hah), it's not that I've never had a passion for coding, it's just that I've always had those dreams of getting into the music industry, but never had the courage to actually do anything about it, even on the side.
But now that I've put together some mixtapes for the whole Secret Santa event and I find them decent enough to share beyond my closest friends, I want to finally give it a proper shot. I've chosen to go by the alias _Mike Alpha Charlie_ and even registered the `mikealphacharlie.com` domain name.
I'm planning to launch this project's website and share my very first mix later this month on SoundCloud and YouTube. I'll also need to start posting more on Instagram and maybe Threads as well. It might go somewhere, or it might remain a side thing forever, but at least I'm going to try for once instead of leaving it in the realm of dreams for eternity.
## Final thoughts
2025 was without a doubt one hell of a difficult year for me. But it was also a year of breakthroughs: the ASD diagnosis and dropping out of uni. I started following my own direction and I still have the backing of my family (not just in the financial sense) and friends.
I've realised that I matter to a lot of people, so I'm glad I haven't taken the easy way out prematurely, and I'm forever grateful for their continued support. The road ahead will certainly have plenty of bumps, but I feel better equipped to handle them this time.


@@ -0,0 +1,335 @@
+++
title = "Mood Tracker Spreadsheet Breakdown"
date = 2025-10-04
description = "Analysing a spreadsheet I made to analyse my mood."
[taxonomies]
tags = ["Technical", "Google Sheets"]
+++
## Background and live demo
If you've read [my previous post](/blog/hello-again-world), then you probably remember me mentioning a Google Sheets spreadsheet I created back in July to keep track of my mood changes before my control visit in August.
Since I've really enjoyed this little experiment so far (with both my psychiatrist and my therapist encouraging me to continue it), I thought I'd share the inner workings of the latest edition of the spreadsheet ahead of the first monthly mood report post coming towards the end of this month.
In case you just want to copy the spreadsheet to track your own mood or look under the hood yourself, I've prepared a [live demo with random data](https://docs.google.com/spreadsheets/d/1lxI0PVTUZBtG8UaloNwUohoybKL0SAKA8mEEseSIfpk/edit?usp=sharing) you can copy to your account and play around with to your heart's content.
## Data structure
Because I'm supposed to schedule my next control visit for January next year and I'm too lazy to create separate spreadsheets for each month, I've opted to split these 4 or so months into 4 seasons of 4 weeks each. Every single day is given a rating on an integer-only scale from 1 to 5, where the greater the number, the better my overall mood that day.
### Columns and named ranges
Therefore I've labelled 4 columns: **Season (A)**, **Week (within a season; B)**, **Date (C)**, and **Rating (D)**. The first row is reserved for column headers, so the actual tracker entries start at row 2. They go all the way to row 113: 16 weeks means 112 days, plus 2 (the first row number) minus 1 (since the range includes that first row).
Speaking of ranges, I've created named ranges with the same names as their respective columns, as well as a named range covering all the raw data called _Entries_, so as to make filter operations more readable. We'll see that in action shortly.
Anyway, I've got the columns set up, but what about the rows? Filling in the last two columns is straightforward: I enter the first date and drag the cell down to the final entry, whereas the ratings... well, I assign them on a day-by-day basis. As for the other two columns, I've made use of clever formulas to automate the process of splitting the entries in the manner I described earlier.
### Splitting rows into seasons and weeks
I'll break down the **Season** column first. Since every row corresponds to a single day and every season is 28 days long, I take the zero-based row index (i.e. subtract 2 from `ROW()`), divide it by the number of days in a season, discard the decimal places from the division result (or in other words, `FLOOR()` it), and add 1 to adjust the resulting season index to one-based numbering. The formula looks like this:
```scala
=FLOOR((ROW()-2)/28)+1
```
{{ admonition(type="note", title="WHAT'S UP WITH SCALA?", text="Code snippets use Scala syntax highlighting, because Zola doesn't provide a dedicated language option for Excel formulas, and Scala just so happens to work well enough as a substitute.") }}
Now it's time for the **Week** column. I determine the absolute week number using an analogous approach, but this time dividing the zero-based row index by 7. However, I want to have the week number go back to 1 after I go from the last day of one season to the first day of the following season.
In order to have the week cycle from 1 to 4, I take the remainder from dividing the week number by 4 and add one to that instead of the week index. This results in the following formula:
```scala
=MOD(FLOOR((ROW()-2)/7),4)+1
```
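To sanity-check both formulas, here's a quick Python mirror of the same arithmetic (just a sketch of the logic - the spreadsheet itself obviously doesn't run Python):

```python
def season_and_week(row: int) -> tuple[int, int]:
    """Mirror of the Season/Week formulas for a given sheet row (2-113)."""
    idx = row - 2               # zero-based day index, like ROW()-2
    season = idx // 28 + 1      # FLOOR((ROW()-2)/28)+1
    week = (idx // 7) % 4 + 1   # MOD(FLOOR((ROW()-2)/7),4)+1
    return season, week

# Row 2 (the first entry) lands in season 1, week 1;
# row 30 (day 29) rolls over into season 2, week 1.
```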
Alright, so let's have a look at an excerpt from the live demo, filtered to show only the first row for every week of every season. We should see the aforementioned cycle as every date jumps by 7 days, with each reset causing the season to go up by 1.
| Season | Week | Date |
| ------ | ---- | ---------- |
| 1 | 1 | 23.09.2025 |
| 1 | 2 | 30.09.2025 |
| 1 | 3 | 07.10.2025 |
| 1 | 4 | 14.10.2025 |
| 2 | 1 | 21.10.2025 |
| 2 | 2 | 28.10.2025 |
| 2 | 3 | 04.11.2025 |
| 2 | 4 | 11.11.2025 |
| 3 | 1 | 18.11.2025 |
| 3 | 2 | 25.11.2025 |
| 3 | 3 | 02.12.2025 |
| 3 | 4 | 09.12.2025 |
| 4 | 1 | 16.12.2025 |
| 4 | 2 | 23.12.2025 |
| 4 | 3 | 30.12.2025 |
| 4 | 4 | 06.01.2026 |
Looks good to me! Let's move on.
## Weekday rating averages
As the header suggests, this next sheet aims to showcase the average rating for every day of the week throughout the whole season. The table consists of two columns: **Weekday (C)** and **Average (D)**. Like in the _main_ data table, the first row is reserved for column headers.
For **Weekday**, I had to adapt to the default numbering of weekdays used by Google Sheets and the [`WEEKDAY` function](https://support.google.com/docs/answer/3092985?hl=en), where 1 means Sunday, 2 means Monday, and so on until you reach Saturday at number 7. I've also applied custom date formatting to that column to display the corresponding weekday's short name.
Here's how the **Average** column gets filled out. For each row, I filter through all the _Entries_ by selecting only the ones with dates that fall on the weekday of the current row, extracting the 4th column from the results (the **Rating** column) and finally calculating the average.
All it takes is writing the following formula for the first weekday row (the second row overall) and dragging it all the way to the final weekday:
```scala
=AVERAGE(
CHOOSECOLS(
FILTER(Entries, WEEKDAY(Date)=C2),
4
)
)
```
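The same aggregation can be sketched in Python, mapping dates to Sheets' weekday numbering (1 = Sunday); the function names here are my own, not anything from the spreadsheet:

```python
from datetime import date, timedelta

def sheets_weekday(d: date) -> int:
    """Google Sheets' default WEEKDAY() numbering: 1 = Sunday, ..., 7 = Saturday."""
    return d.isoweekday() % 7 + 1

def weekday_average(entries: list[tuple[date, int]], weekday: int) -> float:
    """The AVERAGE(CHOOSECOLS(FILTER(...), 4)) pipeline, minus the spreadsheet."""
    ratings = [rating for d, rating in entries if sheets_weekday(d) == weekday]
    return sum(ratings) / len(ratings)

# Two Mondays rated 2 and 4 average out to 3.0; the lone Tuesday stays at 5.0.
monday = date(2025, 9, 22)
entries = [(monday, 2), (monday + timedelta(days=7), 4), (monday + timedelta(days=1), 5)]
```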
Now, I know that the [`QUERY` function](https://support.google.com/docs/answer/3093343?hl=en) is a thing, but I thought I'd spice things up by deliberately refraining from using it. Anyway, here we have the resulting table:
| Weekday | Average |
| ------- | ------- |
| Mon | 2.50 |
| Tue | 3.13 |
| Wed | 3.13 |
| Thu | 2.25 |
| Fri | 2.94 |
| Sat | 3.44 |
| Sun | 3.06 |
## Rating counts
Next up is a sheet for listing each rating's count in every week of a certain season and overall. It consists of a dropdown menu for season selection, as well as a table for presenting the counts on a week-by-week and overall basis.
The dropdown menu is located at cell `B1` and allows you to choose a unique value from the `=Season` range. There's no need to explicitly place it inside the [`UNIQUE` function](https://support.google.com/docs/answer/10522653?hl=en); Google Sheets will automatically remove all the duplicates for you.
### Named functions bonanza
Now, the real fun begins with the aforementioned table. It has 5 columns and 5 rows, where every column header represents a specific rating, and every row header represents a period within the selected season (the first four label their respective weeks, the last one labels the whole season). Each cell contains the number of occurrences of a certain rating in a certain period.
Here's the kicker: this whole table (including the headers) is generated using formulas. The column headers are nothing else but a simple [`=SEQUENCE(1,5)`](https://support.google.com/docs/answer/9368244?hl=en), which outputs 1 row and 5 columns (provided parameters) of subsequent integers from 1 onwards (by default). Easy as it can be, but what about the rest of the table?
I've managed to generate it using a single expression... sort of. In case you haven't figured it out already, I've created a bunch of [named functions](https://support.google.com/docs/answer/12504534?hl=en) to move some potentially reusable pieces into their own formulas, but also to improve the general readability of the main formula. I'll explain them first, and then I'll put them all together to create _the big one_.
#### `ROW_HEADER(week)`
This function accepts an integer in range `[1; 5]` and returns appropriate text, which is meant to serve as the header for a row representing a specific week or the overall period. Here's the definition:
```scala
=IF(week=5, "Overall", JOIN(" ", "Week", week))
```
That's incredibly reusable, but also incredibly boring. Let's examine something more... bizarre.
#### `RATINGS_FOR_PERIOD(entries, season, week)`
This function accepts:
1. A table just like the one under the `Entries` named range.
2. An integer in range `[1; 4]`.
3. An integer in range `[1; 5]`.
It returns an array of ratings registered in a given `week` of a given `season`, or all the ratings throughout specified `season` if `week` is set to 5. Here's the definition:
```scala
=CHOOSECOLS(
FILTER(
entries,
(CHOOSECOLS(entries,1)=season)*((week=5)+(CHOOSECOLS(entries,2)=week))
),
4
)
```
You might be wondering what the [`*` (multiply)](https://support.google.com/docs/answer/3093978?hl=en) and [`+` (add)](https://support.google.com/docs/answer/3093590?hl=en) operators do here and how they're supposed to work in this big filter condition. Although this doesn't seem to be documented in the reference pages I linked to, you can apparently use these operators as substitutes for [logical `AND`](https://support.google.com/docs/answer/3093301?hl=en) and [logical `OR`](https://support.google.com/docs/answer/3093306?hl=en).
The reason I've had to replace these functions with operators is that, for whatever reason, the left-hand side of the `AND` would get evaluated early, altering the dimensions of `entries` while the right-hand side still expected the initial dimensions, causing a value error in the process.
Making sense of this whole function should be much easier now. It essentially selects the **Ratings** column from all the `entries` registered in:
- Provided `season` overall if the `week` parameter is set to 5.
- Provided `week` of that `season` otherwise.
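Here's how I'd write the same selection in Python (a sketch over hypothetical `(season, week, date, rating)` rows); note how `week == 5 or ...` plays the role of the `(week=5)+(...)` pseudo-`OR`:

```python
def ratings_for_period(entries: list[tuple], season: int, week: int) -> list[int]:
    """Select the Rating column (index 3) for a whole season, or one of its weeks."""
    return [
        row[3]
        for row in entries  # rows of (season, week, date, rating)
        if row[0] == season and (week == 5 or row[1] == week)
    ]
```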
#### `LIST_COUNTS(week, ratings)`
This function accepts:
1. An integer in range `[1; 5]`.
2. An array of integers in range `[1; 5]`.
It returns a row containing a text header, along with the number of occurrences of each rating in the provided `ratings` array. Here's the definition:
```scala
={
ROW_HEADER(week),
MAP(
SEQUENCE(1, 5),
LAMBDA(r, COUNTIF(ratings, r))
)
}
```
One very cool thing to note is that you don't have to manually [`FLATTEN`](https://support.google.com/docs/answer/10307761?hl=en) the mapped sequence, because the comma (or backslash, depending on your regional settings) is responsible for joining two rows together into one continuous row. If you'd like to join two columns, use a semicolon instead.
### The Big One
Okay, it's time to piece together the formula to generate the rest of the table. I started off with a column sequence (i.e. `=SEQUENCE(5,1)`), with each row item acting as a `week` number.
I could then apply a `MAP` to generate a row via the `LIST_COUNTS` named function, passing the aforementioned week number, along with an array of ratings obtained by calling `RATINGS_FOR_PERIOD` with the `Entries` named range, the selected season at cell `B1`, and the same `week` number.
Translating this description to a formula using the named functions above looks like this:
```scala
=MAP(
SEQUENCE(5, 1),
LAMBDA(
week,
LIST_COUNTS(
week,
RATINGS_FOR_PERIOD(Entries, B1, week)
)
)
)
```
It produces the table below (for season 1 in the live demo):
{% wide_container() %}
| Period | 1 | 2 | 3 | 4 | 5 |
| ------- | --- | --- | --- | --- | --- |
| Week 1 | 1 | 4 | 2 | 0 | 0 |
| Week 2 | 2 | 1 | 2 | 2 | 0 |
| Week 3 | 1 | 0 | 1 | 2 | 3 |
| Week 4 | 2 | 1 | 0 | 3 | 1 |
| Overall | 6 | 6 | 5 | 7 | 4 |
{% end %}
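As a sanity check, the table body can be reproduced in Python with a `Counter` (again just my own sketch, fed with made-up rows rather than the live demo's data):

```python
from collections import Counter

def list_counts(week: int, ratings: list[int]) -> list:
    """One table row: a period header plus occurrence counts for ratings 1-5."""
    header = "Overall" if week == 5 else f"Week {week}"
    counts = Counter(ratings)
    return [header] + [counts[r] for r in range(1, 6)]

def counts_table(entries: list[tuple], season: int) -> list[list]:
    """The MAP(SEQUENCE(5,1), ...) loop over (season, week, date, rating) rows."""
    def ratings_for(week: int) -> list[int]:
        return [row[3] for row in entries
                if row[0] == season and (week == 5 or row[1] == week)]
    return [list_counts(week, ratings_for(week)) for week in range(1, 6)]
```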
Just to drive home the fact that this formula is indeed _the big one_, here's an inlined variant (i.e. one where all the named function calls have been replaced by their bodies):
```scala
=MAP(
SEQUENCE(5, 1),
LAMBDA(
w,
{
IF(w=5, "Overall", JOIN(" ", "Week", w)),
MAP(
SEQUENCE(1, 5),
LAMBDA(
r,
COUNTIF(
CHOOSECOLS(
FILTER(
Entries,
(Season=B1)*((w=5)+(Week=w))
),
4
),
r
)
)
)
}
)
)
```
Imagine trying to debug this monster without all the formatting. Yeah, no thanks!
## Rating distributions
This sheet is actually very similar to the previous one, both when it comes to the underlying formulas and the resulting table. The latter has the same row labels, only this time the columns provide a [five-number summary](https://en.wikipedia.org/wiki/Five-number_summary) of ratings for a given period in the selected season, plus the interquartile range. I've labelled the columns manually as follows:
1. Min
2. Q1 (first quartile)
3. Median
4. Q3 (third quartile)
5. Max
6. IQR (interquartile range)
As for the rest of the table, the main formula really does look familiar:
```scala
=MAP(
SEQUENCE(5, 1),
LAMBDA(
week,
LIST_QUARTILES(
week,
RATINGS_FOR_PERIOD(Entries, B1, week)
)
)
)
```
I've only had to replace the `LIST_COUNTS` call with a call to `LIST_QUARTILES`, which goes to show how powerful named functions can be when designing complex formulas. Let's take a look at this new function:
```scala
={
ROW_HEADER(week),
MAP(
SEQUENCE(1, 5, 0),
LAMBDA(q, QUARTILE(ratings, q))
)
}
```
Simple and elegant, especially when you take advantage of the fact that passing 0, 2, and 4 as the second argument of the [`QUARTILE` function](https://support.google.com/docs/answer/3094041?hl=en) will have it return the minimum, median, and maximum value in the dataset respectively.
But where's the IQR column? Although I could've incorporated it into the formula, I've chosen to add a separate column that's effectively `Q3-Q1`, and voilà:
{% wide_container() %}
| Period | Min | Q1 | Median | Q3 | Max | IQR |
| ------- | --- | --- | ------ | --- | --- | --- |
| Week 1 | 1 | 2 | 2 | 2.5 | 3 | 0.5 |
| Week 2 | 1 | 1.5 | 3 | 3.5 | 4 | 2 |
| Week 3 | 1 | 3.5 | 4 | 5 | 5 | 1.5 |
| Week 4 | 1 | 1.5 | 4 | 4 | 5 | 2.5 |
| Overall | 1 | 2 | 3 | 4 | 5 | 2 |
{% end %}
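For reference, Python's `statistics.quantiles` with `method="inclusive"` appears to use the same interpolation as Sheets' `QUARTILE`, so a full row of this table (IQR included) can be sketched as:

```python
import statistics

def five_number_summary(ratings: list[int]) -> list[float]:
    """Min, Q1, median, Q3, max, and IQR for a list of ratings."""
    # "inclusive" interpolates between sorted data points, like QUARTILE does
    q1, median, q3 = statistics.quantiles(ratings, n=4, method="inclusive")
    return [min(ratings), q1, median, q3, max(ratings), q3 - q1]
```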
## Season summary
At this point, I'm pretty sure you know the drill. I wanted to create a table that would list some key measures for each week in the season and overall. We're talking the minimum, maximum, average, median, mode, and standard deviation.
The `SUMMARISE(week, ratings)` named function I've written to list those stats isn't quite as elegant as `LIST_QUARTILES`, but it's far from terrible:
```scala
={
ROW_HEADER(week),
MIN(ratings),
MAX(ratings),
AVERAGE(ratings),
MEDIAN(ratings),
JOIN("; ", MODE.MULT(ratings)),
STDEVP(ratings)
}
```
The only thing worth noting about this function is that I reckon it's more reasonable to treat the provided ratings as the entire population rather than a sample, even when dealing with individual weeks - hence using [`STDEVP`](https://support.google.com/docs/answer/3094105?hl=en) across the board instead of [`STDEV`](https://support.google.com/docs/answer/3094054?hl=en), rather than reserving the former for the overall period and using the latter for weeks.
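Sketched in Python, with `statistics.multimode` standing in for `MODE.MULT` and `pstdev` for `STDEVP` (the dict keys are my own labels):

```python
import statistics

def summarise(ratings: list[int]) -> dict:
    """Min, max, average, median, mode(s), and population standard deviation."""
    return {
        "min": min(ratings),
        "max": max(ratings),
        "average": statistics.mean(ratings),
        "median": statistics.median(ratings),
        "mode": "; ".join(str(m) for m in statistics.multimode(ratings)),
        "stdev": statistics.pstdev(ratings),  # population, not sample, stdev
    }
```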
With that out of the way, this is the autogenerated table for season 1 in the live demo:
{% wide_container() %}
| Period | Min | Max | Average | Median | Mode | Std. dev. |
| ------- | --- | --- | ------- | ------ | ------- | --------- |
| Week 1 | 1 | 3 | 2.14 | 2 | 2 | 0.64 |
| Week 2 | 1 | 4 | 2.57 | 3 | 1; 3; 4 | 1.18 |
| Week 3 | 1 | 5 | 3.86 | 4 | 5 | 1.36 |
| Week 4 | 1 | 5 | 3.00 | 4 | 4 | 1.51 |
| Overall | 1 | 5 | 2.89 | 3 | 4 | 1.37 |
{% end %}
## It's a wrap
Thank you for reading this article all the way through. Although the new academic year is underway, I can't wait to do more data science experiments in my spare time and write them up in a similar fashion... provided I find enough energy, haha.
Take care, and I hope to see you in the next post!


@@ -0,0 +1 @@
abuseipdb-verification-inSaoelZ


@@ -12,20 +12,40 @@ footer {
text-align: justify;
}
#notbyai {
#badges {
width: 100%;
max-width: 150px;
max-width: var(--max-layout-width);
display: flex;
justify-content: center;
align-items: baseline;
flex-wrap: wrap;
gap: 1.25rem;
margin: 0 auto 1.75rem auto;
}
#badges a {
max-width: 150px;
transition-duration: 300ms;
}
#notbyai:hover,
#notbyai:hover::before {
#badges a:hover,
#badges a:hover::before {
background-color: transparent;
color: transparent;
transform: scale(1.1);
}
#notbyai:active {
#badges a:active {
transform: scale(0.95);
}
#abuseipdb-badge {
width: 150px;
background: #35c246 linear-gradient(
rgba(255,255,255,0),
rgba(255,255,255,.3) 50%,
rgba(0,0,0,.2) 51%, rgba(0,0,0,0)
);
padding: 5px;
box-shadow: 2px 2px 1px 1px rgba(0, 0, 0, .2);
}


@@ -0,0 +1,289 @@
%YAML 1.2
---
# The following syntax definition is based on:
# https://github.com/MinecraftCommands/syntax-mcfunction/tree/51eb8bf4ca04355bb89f5538b7331cb0c0f3df2b
name: mcfunction
file_extensions: [mcfunction]
scope: source.mcfunction
contexts:
main:
- include: root
comments:
- match: '^\s*(#[>!#])(.+)$'
captures:
"1": comment.block.mcfunction
"2": markup.bold.mcfunction
push:
- meta_scope: meta.comments
- include: comments_block
- match: ^(?!#)
pop: true
- match: ^\s*#.*$
scope: meta.comments
captures:
"0": comment.line.mcfunction
comments_block:
- match: '^\s*#[>!]'
push:
- meta_scope: meta.comments_block
- include: comments_block_emphasized
- match: $
pop: true
- match: '^\s*#'
push:
- meta_scope: meta.comments_block
- include: comments_block_normal
- match: $
pop: true
comments_block_emphasized:
- include: comments_block_special
- match: \S+
scope: meta.comments_block_emphasized
captures:
"0": markup.bold.mcfunction
comments_block_normal:
- include: comments_block_special
- match: \S+
scope: meta.comments_block_normal
captures:
"0": comment.block.mcfunction
comments_block_special:
- match: '@\S+'
scope: meta.comments_block_special
captures:
"0": markup.heading.mcfunction
- include: resource-name
- match: "[#%$][A-Za-z0-9_.#%$]+"
scope: meta.comments_block_special
captures:
"0": variable.other.mcfunction
comments_inline:
- match: "#.*$"
scope: meta.comments
captures:
"0": comment.line.mcfunction
literals:
- match: \b(true|false|True|False)\b
scope: meta.literals
captures:
"0": constant.numeric.boolean.mcfunction
- match: '\b[0-9a-fA-F]+(?:-[0-9a-fA-F]+){4}\b'
scope: meta.names
captures:
"0": variable.uuid.mcfunction
- match: '[+-]?\d*\.?\d+([eE]?[+-]?\d+)?[df]?\b'
scope: meta.literals
captures:
"0": constant.numeric.float.mcfunction
- match: '[+-]?\d+(b|B|L|l|s|S)?\b'
scope: meta.literals
captures:
"0": constant.numeric.integer.mcfunction
- match: \.\.
scope: meta.ellipse.literals
captures:
"0": variable.other.mcfunction
- match: '"'
push:
- meta_scope: string.quoted.double.mcfunction
- include: literals_string-double
- match: '"'
pop: true
- match: "'"
push:
- meta_scope: string.quoted.single.mcfunction
- include: literals_string-single
- match: "'"
pop: true
literals_string-double:
- match: \\.
scope: meta.literals_string-double
captures:
"0": constant.character.escape.mcfunction
- match: \\
scope: meta.literals_string-double
captures:
"0": constant.character.escape.mcfunction
- include: macro-name
- match: '[^\\"]'
scope: meta.literals_string-double
captures:
"0": string.quoted.double.mcfunction
literals_string-single:
- match: \\.
scope: meta.literals_string-single
captures:
"0": constant.character.escape.mcfunction
- match: \\
scope: meta.literals_string-double
captures:
"0": constant.character.escape.mcfunction
- include: macro-name
- match: '[^\\'']'
scope: meta.literals_string-single
captures:
"0": string.quoted.single.mcfunction
macro-name:
- match: '(\$\()([A-Za-z0-9_]*)(\))'
scope: meta.macro-name
captures:
"1": punctuation.definition.template-expression.begin.mcfunction
"2": variable.other.mcfunction
"3": punctuation.definition.template-expression.end.mcfunction
names:
- match: '^(\s*)([a-z_]+)(?=\s)'
scope: meta.names
captures:
"1": whitespace.mcfunction
"2": keyword.control.flow.mcfunction
- match: '^(\s*)(\$)( ?)([a-z_]*)'
scope: meta.names
captures:
"1": whitespace.mcfunction
"2": markup.italic.mcfunction
"3": whitespace.mcfunction
"4": keyword.control.flow.mcfunction
- match: '(run)(\s+)([a-z_]+)'
scope: meta.names
captures:
"1": entity.name.mcfunction
"2": whitespace.mcfunction
"3": keyword.control.flow.mcfunction
- include: resource-name
- match: '[A-Za-z]+(?=\W)'
scope: meta.names
captures:
"0": entity.name.mcfunction
- match: "[A-Za-z_][A-Za-z0-9_.#%$]*"
scope: meta.names
captures:
"0": string.unquoted.mcfunction
- include: macro-name
- match: '([#%$]|((?<=\s)\.))[A-Za-z0-9_.#%$\-]+'
scope: meta.names
captures:
"0": variable.other.mcfunction
operators:
- match: "[~^]"
scope: meta.operators
captures:
"0": constant.numeric.mcfunction
- match: '[\-%?!+*<>\\/|&=.:,;]'
scope: meta.operators
captures:
"0": keyword.operator.mcfunction
property:
- match: '\{'
push:
- meta_scope: meta.property.curly
- include: resource-name
- include: literals
- include: property_key
- include: operators
- include: property_value
- include: main
- match: '\}'
pop: true
- match: '\['
push:
- meta_scope: meta.property.square
- include: resource-name
- include: literals
- include: property_key
- include: operators
- include: property_value
- include: main
- match: '\]'
pop: true
- match: \(
push:
- meta_scope: meta.property.paren
- include: resource-name
- include: literals
- include: property_key
- include: operators
- include: property_value
- include: main
- match: \)
pop: true
property_key:
- match: '#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+(?=\s*\=:)'
scope: meta.property_key
captures:
"0": variable.other.mcfunction
- match: '#?[a-z_][a-z0-9_\.\-/]+'
scope: meta.property_key
captures:
"0": variable.other.mcfunction
- match: '[A-Za-z_]+[A-Za-z_\-\+]*'
scope: meta.property_key
captures:
"0": variable.other.mcfunction
property_value:
- match: '#?[a-z_][a-z_\.\-]*\:[a-z0-9_\.\-/]+'
scope: meta.property_value
captures:
"0": string.unquoted.mcfunction
- match: '#?[a-z_][a-z0-9_\.\-/]+'
scope: meta.property_value
captures:
"0": string.unquoted.mcfunction
resource-name:
- match: "#?[a-z_][a-z0-9_.-]*:[a-z0-9_./-]+"
scope: meta.resource-name
captures:
"0": entity.name.function.mcfunction
- match: '#?[a-z0-9_\.\-]+\/[a-z0-9_\.\-\/]+'
scope: meta.resource-name
captures:
"0": entity.name.function.mcfunction
root:
- include: literals
- include: comments
- include: say
- include: names
- include: comments_inline
- include: subcommands
- include: property
- include: operators
- include: selectors
say:
- match: ^(\s*)(say)
captures:
"1": whitespace.mcfunction
"2": keyword.control.flow.mcfunction
push:
- meta_scope: meta.say.mcfunction
- match: \n
pop: true
- match: \\\s*\n
captures:
"0": constant.character.escape.mcfunction
- include: literals_string-double
- include: literals_string-single
- match: (run)(\s+)(say)
captures:
"1": entity.name.mcfunction
"2": whitespace.mcfunction
"3": keyword.control.flow.mcfunction
push:
- meta_scope: meta.say.mcfunction
- match: \n
pop: true
- match: \\\s*\n
captures:
"0": constant.character.escape.mcfunction
- include: literals_string-double
- include: literals_string-single
selectors:
- match: "@[a-z]+"
scope: meta.selectors
captures:
"0": support.class.mcfunction
subcommands:
- match: "[a-z_]+"
scope: meta.literals
captures:
"0": entity.name.class.mcfunction


@@ -1,19 +1,33 @@
<a
id="notbyai"
href="https://notbyai.fyi"
target="_blank"
rel="noopener noreferrer"
>
<img
class="img-light"
loading="lazy"
src="/images/notbyai.png"
alt="Written by human, not by AI."
/>
<img
class="img-dark"
loading="lazy"
src="/images/notbyai-dark.png"
alt="Written by human, not by AI."
/>
</a>
<div id="badges">
<a
href="https://notbyai.fyi"
target="_blank"
rel="noopener noreferrer"
title="Written by human, not by AI."
>
<img
class="img-light"
loading="lazy"
src="/images/notbyai.png"
alt="Written by human, not by AI."
/>
<img
class="img-dark"
loading="lazy"
src="/images/notbyai-dark.png"
alt="Written by human, not by AI."
/>
</a>
<a
href="https://www.abuseipdb.com/user/264039"
target="_blank"
rel="noopener noreferrer"
title="AbuseIPDB is an IP address blacklist for webmasters and sysadmins to report IP addresses engaging in abusive behavior on their networks."
>
<img
id="abuseipdb-badge"
src="https://www.abuseipdb.com/contributor/264039.svg"
alt="AbuseIPDB Contributor Badge"
/>
</a>
</div>