Git vibes?

Another day, another project nobody really needs… but here it is! 😁

As I was reorganizing some stuff across multiple Git repositories (both internal and on GitHub), I noticed that I sometimes forget to save my work. So, to remember (and potentially stop this from happening), I came up with a small and simple PowerShell script to remind me. Of course, it was vibe-coded with ChatGPT’s help.

The idea behind it – I have a local folder with multiple projects/repositories I’m working on. As I mostly switch from one thing to another (or from computer to computer, or …), I sometimes do something and forget to save it to the remote Git instance. And as things with computers tend to happen… 🙂

Smart people of the Internet say (borrowed from https://mastodon.social/@nixCraft/111489234007874526):

So, to potentially stop forgetting (and losing my work), a simple PowerShell script (Check-GitRepos.ps1) goes through the local “projects” folder, gets the latest updates, and asks about and commits the local changes (if there are any). For now, it doesn’t create additional branches, commit to them, etc. – that may be a feature of the next version.
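The actual script is available on my GitHub; just to illustrate the idea, a minimal sketch of such a loop could look something like this (the folder path, prompt text and commit message here are my illustrative choices, not necessarily what the real script does):

```powershell
# Minimal sketch: walk the projects folder, pull, then offer to commit/push.
$projectsRoot = "C:\Projects"

Get-ChildItem -Path $projectsRoot -Directory | ForEach-Object {
    # Only process folders that are actually Git repositories
    if (Test-Path (Join-Path $_.FullName ".git")) {
        Push-Location $_.FullName
        git pull --quiet                  # get the latest updates
        if (git status --porcelain) {     # any uncommitted local changes?
            git status --short
            $answer = Read-Host "Commit and push changes in '$($_.Name)'? (y/n)"
            if ($answer -eq "y") {
                git add -A
                git commit -m "Work in progress ($(Get-Date -Format 'yyyy-MM-dd HH:mm'))"
                git push
            }
        }
        Pop-Location
    }
}
```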

The current version is just fine for my personal use case – smart, simple and quick.

Examples of running it on a local folder:

So, not much else to add – it does what it’s supposed to do. And, as always, it’s available on my GitHub.

Cheers!

P.S. Yeah, I also thought about the question that presents itself – who will remind me to run the script?! Oh, well… 🤷‍♂️😅

P.P.S. There is also git-fire, which may help with emergencies.

Organize pictures and videos… the “vibe” way

The idea for this one came to mind one summer evening, when I was searching for something on my disks, and realized – it’s a mess.

So, I started untangling this mess by first organizing images and videos backed up from my phone(s) into folders. Phone backups are a nice thing… and usually it all ends up in a single folder.

OK, there are options… but it is what it is – now I have a folder named something like “Mobile-Backup-XXX”, with all the files in it… no subfolders. 🤷‍♂️

Of course, when I opened this folder with thousands of files and started moving them manually to their respective subfolders, it soon became clear that I needed help (OK, maybe that was obvious from the start 😁).

And whom do you call for help these days? Ghostbusters? 🤔

Well, no – the answer is always “AI”. More precisely, I called (free) ChatGPT.

Long story short, it helped me write a nice PowerShell script which takes my folder with thousands of files and organizes it a bit by moving those files into (sub)folders named after the date they were taken or created.

After some time, we got the script working, some logging was added, and it was ready for testing – I tested it on a few folders, and then realized that it sometimes has issues reading the “right” metadata, so we reengineered that part.
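The real script reads the proper media metadata; purely as a sketch of the overall shape, with the file’s LastWriteTime as a simple stand-in for the “date taken” metadata (folder name is illustrative):

```powershell
# Sketch: move each file into a subfolder named after its date.
$source = "C:\Mobile-Backup"

Get-ChildItem -Path $source -File | ForEach-Object {
    # Stand-in for the real metadata handling: use the file's timestamp
    $date = $_.LastWriteTime.ToString("yyyy-MM-dd")
    $target = Join-Path $source $date

    # Create the dated subfolder on first use
    if (-not (Test-Path $target)) {
        New-Item -ItemType Directory -Path $target | Out-Null
    }
    Move-Item -Path $_.FullName -Destination $target
}
```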

Some time later, after a few other tiny things were polished, the script was ready and doing its work just as I expected! Nice!

Now, instead of a folder with thousands of images and videos, I have a folder with hundreds of subfolders… 😅

And if you put stuff into folders, you don’t have to look at it, and it doesn’t bother you anymore, right?! 😁

But OK – it’s a first step in organizing stuff! 😊

Could it be improved?! Of course! But… 🤷‍♂️

The script (Organize-PicsAndVids.ps1) is, as always, available on my GitHub.

Cheers!

P.S. This was also somewhat inspired by an episode from the “Scott and Mark Learn To…” series of podcasts by Scott Hanselman and Mark Russinovich – make sure you subscribe and watch them regularly! They rock! 😊

Vibe coding while waiting on my packages

Recently, I was waiting for some packages to arrive via DHL Germany. As I also had some spare time, and it was raining outside, I was playing around with PowerShell and DHL’s API to see if I could get the status of my packages in a nice (to me), formatted way.

TL;DR: I can. 🙂

The idea was to build a small script which will take in a list of tracking numbers from my incoming packages, check their latest statuses and any history it can find, and display it in a nice, formatted way.

So, it all started with exploring the API possibilities on DHL’s Developer website.

To be able to use their APIs, you must register (for a free account):

Next, you have to request API access, so that you get your API key and secret – you do this by creating an app and selecting the APIs it will use (I selected Shipment Tracking – Unified, as it sounded right):

Then you wait a bit (a few minutes) for someone/something to approve your request, and you are ready to go:

Now you have all the building blocks, and it’s time to code… finally. 🙂

I won’t explain the code line by line (it’s not that interesting, and it could be written more nicely), but I will just say that I used the help of (the free) ChatGPT (or rather – it used my ideas?! Who knows… 😁), and “vibe-coded” the whole thing. There were errors, misunderstandings, etc., but we managed to get to something that does the job:
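Stripped to its core, the call itself is simple – a GET against the Shipment Tracking – Unified endpoint, with the API key in a request header. A minimal sketch (endpoint, header and response property names follow DHL’s documented API; error handling and the nice formatting are omitted):

```powershell
$apiKey = "<your-DHL-API-key>"        # from your approved app
$trackingNumber = "<tracking-number>"

# Shipment Tracking - Unified: one GET with the key in a header
$response = Invoke-RestMethod `
    -Uri "https://api-eu.dhl.com/track/shipments?trackingNumber=$trackingNumber" `
    -Headers @{ "DHL-API-Key" = $apiKey }

# Latest status, then the event history
$shipment = $response.shipments[0]
"{0}: {1}" -f $shipment.id, $shipment.status.statusCode
$shipment.events | ForEach-Object {
    "{0}  {1}" -f $_.timestamp, $_.description
}
```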

Just one thing – as I was looking at the outputs, I saw these “detailed info” codes and decided to explore what they mean. For this, I found a CSV file with the explanations, which I then decided to incorporate into my script… just for fun.
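The lookup itself can be as simple as reading the CSV into a hashtable (the file name, column names and sample code below are illustrative, not the actual CSV’s):

```powershell
# Build a code -> description lookup table from the explanations CSV
$codes = @{}
Import-Csv -Path ".\dhl-status-codes.csv" | ForEach-Object {
    $codes[$_.Code] = $_.Description
}

# Later, when printing an event, resolve the code if we know it
$code = "PU"   # illustrative "detailed info" code
if ($codes.ContainsKey($code)) {
    "$code = $($codes[$code])"
}
```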

The script (Track-DHLShipment.ps1) is available on my GitHub.

What’s next? Don’t know… probably my packages will indeed arrive. 🙂

Cheers!

P.S. This was somewhat inspired by an episode from the “Scott and Mark Learn To…” series of podcasts by Scott Hanselman and Mark Russinovich – make sure you subscribe and watch them regularly! They rock! 😊

Using a self-hosted runner with GitHub Actions

As I was going through the excellent short course called Azure Infrastructure as Code with GitHub (by fellow MVP, Barbara Forbes), a thought appeared – what do I need to do to use my custom runner machine inside a pipeline for… I don’t know… security/privacy concerns, isolation, special requirements, different OS, control, price… or just to complicate things a bit?

Of course, GitHub supports this and it’s called a self-hosted runner.

So, what do I need to do to use this self-hosted runner with my GitHub Actions?

It’s relatively simple โ€“ there is an application package, which will be installed on your runner machine, and which will listen for and eventually do all the work defined in your workflow!

But first, let’s introduce my environment.

I have a simple GitHub Action (workflow), which creates a simple storage account in my Azure environment (there is actually no need to convert Bicep to ARM before deployment, but it seemed cool 😀). It’s currently using the “ubuntu-latest” runner, provided by GitHub… which also has all the needed components inside (like Azure CLI, Azure PowerShell, …).

And it works fine. When there is a push to my GitHub repository, GitHub Actions starts and does what is needed on my Azure environment via this workflow:

And the mighty Bicep file (😀) it’s using for the deployment is:
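A minimal Bicep file of that kind – a single storage account with an auto-generated unique name – looks roughly like this (this is my illustrative sketch, not necessarily the exact file):

```bicep
param location string = resourceGroup().location

resource storageAccount 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  // Storage account names must be globally unique, hence uniqueString()
  name: 'st${uniqueString(resourceGroup().id)}'
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```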

Of course, this runs just fine on a standard (hosted) runner:

To run this workflow (successfully), not that much is needed.

First, I created a new virtual machine (a simple Ubuntu Hyper-V VM, no autoscaling, no… nothing) called hermes (god of speed 😀), with a freshly installed Ubuntu 22.04.1-LTS (minimized).

After that, I went to the Settings of my GitHub repository and got the download and install scripts for the x64 Linux runner:
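The GitHub settings page generates these commands for you, with the current runner version and a registration token already filled in; their general shape is as follows (the version, owner/repo and token below are placeholders – use the values GitHub shows you):

```shell
# Download and unpack the runner application package
mkdir actions-runner && cd actions-runner
curl -o actions-runner-linux-x64.tar.gz -L \
  https://github.com/actions/runner/releases/download/v<VERSION>/actions-runner-linux-x64-<VERSION>.tar.gz
tar xzf actions-runner-linux-x64.tar.gz

# Register the runner against the repository (token comes from the same page)
./config.sh --url https://github.com/<OWNER>/<REPO> --token <TOKEN>

# Run it interactively once, to check everything works
./run.sh
```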

As you can see, I’ll be using crontab later to automatically (re)start my self-hosted runner.
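For that automatic (re)start, a crontab entry along these lines does the trick (the path is from my setup – adjust it to wherever you unpacked the runner):

```
@reboot /home/user/actions-runner/run.sh
```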

If everything went well, you should see your runner “up and running” (😀) in the GitHub portal:

Next, I’ll use the following script to install all prerequisites for my workflow (like Azure CLI, Azure PowerShell, etc. – it really depends on your workflow and things you use):
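In my case, that meant roughly the following (the Azure CLI line is Microsoft’s documented install one-liner for Debian/Ubuntu, and the PowerShell steps follow the documented Ubuntu 22.04 route – adjust to whatever your workflow actually uses):

```shell
# Azure CLI (Microsoft's documented install script)
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# PowerShell, from Microsoft's package repository for Ubuntu 22.04
wget -q https://packages.microsoft.com/config/ubuntu/22.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update && sudo apt-get install -y powershell

# Azure PowerShell module, installed from inside PowerShell
pwsh -Command "Install-Module -Name Az -Scope CurrentUser -Force"
```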

Once this is done, my self-hosted runner hermes should be ready to run the workflow.

To try this, I need to make a slight update to my workflow file – line 12 inside the job configuration should be updated from “runs-on: ubuntu-latest” to “runs-on: self-hosted”.

So, my workflow YAML file now looks like this:
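The relevant part is the runner selection; a stripped-down workflow of this shape (my illustrative sketch – the secret and resource-group names are placeholders, and the actual file has a few more steps) would be:

```yaml
name: deploy-storage

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    # was: runs-on: ubuntu-latest
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deploy Bicep
        run: |
          az deployment group create \
            --resource-group <RESOURCE_GROUP> \
            --template-file main.bicep
```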

And once I push the configuration to my GitHub, my workflow automatically starts and runs on hermes, my self-hosted runner:

If we prepared our runner right, all is good! 😊

Of course, our resources are deployed successfully:

So, this is how you can use your own self-hosted runner to execute your GitHub Actions (workflows).

Cheers!

Capturing network trace in Windows

Do you need to capture some network traffic on a Windows box for further analysis, but don’t want to install additional software just… everywhere?

I usually do.

If you didn’t know, Windows has a built-in tool with which you can do just that – (among other things) capture a network trace to a file for further analysis. The tool is called netsh.

So, how do you capture traffic with netsh?

It’s fairly easy (for more options, filters and such, you can always check the accompanying help content – netsh trace start ?):
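The basic flow is just two commands – start the trace, reproduce whatever you want to capture, then stop it (the output path here is my example):

```
netsh trace start capture=yes tracefile=C:\Temp\MyTrace.etl
rem ... reproduce the issue you want to capture ...
netsh trace stop
```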

If you look at the location where you’ve saved your trace, you’ll see two files – of those two files, MyTrace.etl is the one you want:

OK, but what do you do with it?

If you try to open it with, for example, Wireshark, you’ll see it doesn’t work:

So… we have a trace file with which we can’t really do anything?!?

Not exactly!

If you have Microsoft Network Monitor (now archived, but can be found… on the Internet) or Microsoft Message Analyzer (now retired), you can open up and analyze your trace as you normally would:

If you already have Wireshark on, let’s say, your workstation, and want to continue using it for the analysis, this trace needs to be converted to a format which Wireshark understands (I hope that one day Wireshark will open such .etl files natively).

You can convert it by using the free tool called etl2pcapng.

It doesn’t require installation, and if you want to use the pre-compiled binaries, they are available under etl2pcapng releases.

So, convert your (netsh) MyTrace.etl to (Wireshark’s) MyTrace.pcapng with this command:
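Per the tool’s documented usage, it’s a single call – input file first, output file second (run it from the folder where you placed the binary, or put it in your PATH):

```
etl2pcapng.exe MyTrace.etl MyTrace.pcapng
```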

Once converted, you can open the new file (MyTrace.pcapng) in Wireshark, and do what you would usually do to analyze it:

Hope this helps!

Cheers!