Azure bites



  • A short welcome message from my fancy new Windows 11 virtual machine running in Azure.

    Yes, I took the plunge and registered for a "free" Azure account.
    Currently, I am installing Visual Studio 2022 Community edition. Then I would like to test how compatible that is with VS 2019 on my big old machine...

    When creating this VM, I was not able to choose "Azure Germany" as the location (it was not even listed; wtf, why was it shown during the training?), so I put it in "Azure North Europe". At least, cooling won't be an issue there.



  • @BernieTheBernie If I understand correctly, “Azure Germany” is one of the completely special environments, “clouds”, along with “Azure US Government” and “Azure China” (and the default “Azure Public”). When you create the free account, you are in “Azure Public Cloud” (you need a completely separate ‘tenant’ for the other clouds), the default interconnected environment, within which you can choose locations like “West Europe” (in Netherlands) or “North Europe” (in Ireland). There is also “Germany West Central”, but since it is more expensive, it is possible it is not being offered in the free tier.

    A quick search also shows that “Azure Germany” no longer accepts new customers, presumably because the “Germany West Central” location is now good enough for all the compliance purposes the “Azure Germany Cloud” existed for, so that's another point on which the training is obsolete.



  • @Bulb said in Azure bites:

    “West Europe” (in Netherlands) or “North Europe” (in Ireland)

    "North Europe" is not significantly further north than "West Europe", but it is much further west than "West Europe".



  • @HardwareGeek Microsoft and naming. What did you expect?



  • @Bulb said in Azure bites:

    @HardwareGeek Microsoft and naming. What did you expect?

    Germany 365



  • @Kamil-Podlesak bold claims about uptime there. Are we sure they’re still on track for 365 days this year?


  • Discourse touched me in a no-no place

    @Arantor I think they've managed to hit their 36.5% uptime guarantee this year.



  • @Kamil-Podlesak said in Azure bites:

    @Bulb said in Azure bites:

    @HardwareGeek Microsoft and naming. What did you expect?

    Germany 365

    Thanks for selecting a rather innocent number.


  • BINNED

    @Kamil-Podlesak said in Azure bites:

    Germany 365

    Ein Reich Ein Cloud


  • Considered Harmful

    @Luhmann said in Azure bites:

    @Kamil-Podlesak said in Azure bites:

    Germany 365

    Ein Reich Ein Cloud

    Eine Wolke


  • BINNED

    @Applied-Mediocrity
    There is moonspeak in your wolke



  • So I tried to set up a Debian 12 virtual machine on Azure. Should be easy, you'd think: enter "Debian" in the search box, select "Debian 12 bookworm" from the marketplace, and voilà, you can enter the details. Except you cannot.
    The machine needs a "size". It defaulted to "Standard_B4ms", which I used for the Windows 11 machine, and yes, I would want to use that here, too. But it is not available. Click on the "See all sizes" link, and voilà, a whole page of sizes which are all, again, not available.
    Only after moving the requested location from North Europe (Dublin) to Sweden Central did it work.
    Great experience!
    AzureDebian.JPG


  • And then the murders began.

    @BernieTheBernie There’s a quota on how many vCPUs you can have in each region. If you increase the quota in your original region, per the error message, it would work.
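    In code terms the check is just arithmetic; a hypothetical sketch follows (the 4 vCPU quota is an assumption based on what the free tier appears to allow, not a documented number):

```python
# Hypothetical sketch of the per-region vCPU quota check; the quota value
# below is an assumption, not something Azure's error message confirms.
def fits_quota(requested_vcpus: int, used_vcpus: int, regional_quota: int) -> bool:
    """True if the new VM fits within the region's remaining vCPU quota."""
    return used_vcpus + requested_vcpus <= regional_quota

# North Europe already holds the 4 vCPU Windows 11 VM, so a second
# Standard_B4ms (4 vCPU) is rejected; an empty region accepts it.
print(fits_quota(4, used_vcpus=4, regional_quota=4))  # False
print(fits_quota(4, used_vcpus=0, regional_quota=4))  # True
```

    Which would explain why the same size suddenly "exists" again in an empty region like Sweden Central.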



  • @Unperverted-Vixen I tried that, too. But I use the "Azure Free Account" for playing around a little, so I am not entitled to do so...
    Seems that the "biggest" machine I can get has 4 vCPU and 32 GB memory.
    Uhm... Do I use a "cloud", or is it just mist/fog, or even smog?
    🤔



  • And I found another cutie. I have a storage account. When creating it, I made sure that I can set anonymous access for individual containers in the storage account.
    So I created a container with public access to the container:
    PublicContainer.JPG
    But whenever I call it in the browser, I get an error message:

    <Error>
        <Code>ResourceNotFound</Code>
        <Message>The specified resource does not exist. RequestId:5960587e-401e-0021-364d-30b5f0000000 Time:2023-12-16T18:24:41.6370577Z</Message>
    </Error>
    

    But I can access files in it, e.g.

    https://storagebernie.blob.core.windows.net/testpublic/IMGP8315.JPG
    

    works as intended...
    :wtf_owl: :wtf-whistling:

    Btw, the storage account is located in "Azure North Europe", i.e. Dublin.
    DogsB: What did you expect from drunken Irish men? :yell-at-cloud:


  • Discourse touched me in a no-no place

    @BernieTheBernie said in Azure bites:

    @Unperverted-Vixen I tried that, too. But I use the "Azure Free Account" for playing around a little, so I am not entitled to do so...
    Seems that the "biggest" machine I can get has 4 vCPU and 32 GB memory.
    Uhm... Do I use a "cloud", or is it just mist/fog, or even smog?
    🤔

    That's good for free.

    EC2 free tier is 2 vCPU and 1 GB memory max IIRC.



  • @loopback0 With the free account comes a $200 credit which can be spent on whatever Microsoft Azure service, during the first month only. So my Win 11 machine has already eaten a few (2-3) dollars of that amount.

    The virtual machines free for a whole year are the type you said: B1s offers 1 vCPU, 1 GB RAM, and 750 hours of use during the first year (note: you have to deallocate the machine after use; if you merely shut it down, it is still allocated and fully billed).
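    The stop-vs-deallocate distinction can be scripted; here is a sketch in the style of the azure-mgmt-compute Python SDK (begin_power_off and begin_deallocate are real operations on its VirtualMachinesOperations; the resource names below are made up):

```python
# Sketch of stop vs. deallocate, azure-mgmt-compute style; the resource
# group and VM names used when calling this are made up for illustration.
def release_vm(compute_client, resource_group: str, vm_name: str):
    """Deallocate a VM so it stops eating the 750 free hours.

    begin_power_off() would only shut the OS down while keeping the
    compute allocated (and billed); begin_deallocate() releases it.
    """
    poller = compute_client.virtual_machines.begin_deallocate(resource_group, vm_name)
    return poller.result()  # wait until Azure reports the VM as deallocated
```

    The client is passed in, so the same helper also works against a stub in tests.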


  • Notification Spam Recipient

    @BernieTheBernie said in Azure bites:

    Dublin

    I'm seeing a common theme here...



  • @BernieTheBernie said in Azure bites:

    But whenever I call it in the browser, I get an error message:

    <Error>
        <Code>ResourceNotFound</Code>
        <Message>The specified resource does not exist. RequestId:5960587e-401e-0021-364d-30b5f0000000 Time:2023-12-16T18:24:41.6370577Z</Message>
    </Error>
    

    But I can access files in it, e.g.

    https://storagebernie.blob.core.windows.net/testpublic/IMGP8315.JPG
    

    works as intended...

    Storage services have stopped supporting directory listings a … well, as far as I know none of the cloud ones does.

    There is a separate configuration page for setting the storage up as static website, but I don't know whether it supports generating indices or just serving them.

    That said, I would prefer it if the protocol were compatible with WebDAV rather than a custom HTTP-based API.



  • @Bulb Yeah. I know for S3 at least, "directories" aren't really anything of the sort. It's all tags, sharded on some set of disks in an internally-sensible manner for storage/recovery/etc. optimizations rather than trying to keep it all representable to a naive, unaware consumer. So your conventional "list this directory" approaches just don't even make sense. You can absolutely write a client that translates the tag hierarchy into directories--I use one as part of my VTT. But I wouldn't expect the bare APIs to do that at all.



  • @Benjamin-Hall In Azure it is by default the same, but there is also an option for “hierarchical namespace”, which enables some features for directories, but I didn't look deeper into which ones.

    It should be possible to enable the Hadoop protocol with the hierarchical namespace, and that should support listing, though.



  • … of course it does support listing blobs even in the basic setting, but the URL is https://{account}.blob.core.windows.net/{container}?restype=container&comp=list&prefix={path}. For :raisins:.
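    A sketch of scripting that incantation in Python (account and container names borrowed from the earlier post; the sample XML is an abbreviated stand-in for the List Blobs response shape, not a captured response):

```python
import xml.etree.ElementTree as ET

def list_blobs_url(account: str, container: str, prefix: str = "") -> str:
    """Build the :raisins:-flavoured List Blobs URL."""
    return (f"https://{account}.blob.core.windows.net/{container}"
            f"?restype=container&comp=list&prefix={prefix}")

def blob_names(list_xml: str) -> list:
    """Extract the blob names from the XML body the endpoint returns."""
    root = ET.fromstring(list_xml)
    return [blob.findtext("Name") for blob in root.iter("Blob")]

# Abbreviated example of the response shape:
sample = """<EnumerationResults ContainerName="testpublic">
  <Blobs><Blob><Name>IMGP8315.JPG</Name></Blob></Blobs>
</EnumerationResults>"""
```

    The bare container URL, by contrast, returns the ResourceNotFound error quoted above, which is presumably why the browser experiment failed.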



  • @Bulb said in Azure bites:

    hadoop protocol

    … and I, again, wonder, why the hell that's based on plain HTTP and not WebDAV (it has some other transport protocols too, but webdav doesn't seem to be one of them).



  • @Bulb said in Azure bites:

    … of course it does support listing blobs even in the basic setting, but the URL is https://{account}.blob.core.windows.net/{container}?restype=container&comp=list&prefix={path}. For :raisins:.

    Anyway, the BlobContainerClient can list the blobs, too.
    Why would anyone think that listing container contents could work in a simple convenient way?
    :raisins:
    💩
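    For comparison, the Python counterpart of BlobContainerClient is ContainerClient, whose list_blobs() hides the restype/comp incantation; a minimal sketch (the client is passed in, so this runs against anything exposing a compatible list_blobs()):

```python
# The SDK route: ContainerClient (Python's counterpart of .NET's
# BlobContainerClient) wraps the restype/comp query string for you.
def names_under(container_client, prefix: str = "") -> list:
    """List blob names under a prefix via the SDK client's list_blobs()."""
    return [blob.name for blob in container_client.list_blobs(name_starts_with=prefix)]
```

    With the real SDK you would obtain the client via e.g. ContainerClient.from_container_url(...) and pass it in.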



  • @BernieTheBernie said in Azure bites:

    Why would anyone think that listing container contents could work in a simple convenient way?

    That would allow customers to easily switch between the different storage solutions from Microsoft, Amazon, Google and others. Microsoft can't have any of that.



  • @Bulb Do you want to imply that Amazon, Google, and others do not use specific magic incantations for such listing?
    That would be truly astonishing.
    :surprised-pikachu:



  • When something goes wrong on Azure...
    What? Something goes wrong on Azure?
    :fun:
    So I wrote a Function App (Amazon's equivalent is called Lambda). The code worked well when I tested it in a console app. But as a function app, it failed. Just failed.
    Eventually I found a place in the Azure portal where I saw an error message.
    Aha: an InvalidOperationException occurred when the BlobContainerClient was created. :wtf:?
    Somewhere on StackOverflow I found a similar situation, and they pointed to a storage account issue (a function app requires storage space for its configuration). So I went back to the Publish page of Visual Studio. There, some item mentioned storage, and next to it were 3 dots instead of a green check mark. Clicked on those 3 dots, went through the following dialogs, and eventually a green check mark appeared.
    Published the function app again, and voilà, it works! 🎉 🎆
    InvalidOperationException is such a great description of the issue...
    Well, it's :fun:! :yell-at-cloud:
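    For reference: the storage dependency that green check mark restored is the AzureWebJobsStorage app setting the Functions runtime reads at start-up; for local runs the same knob lives in local.settings.json, roughly like this (values are placeholders):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<storage-account>;AccountKey=<key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```

    If that setting is missing or points at a dead storage account, the app fails at start-up, much as described above.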



  • @BernieTheBernie said in Azure bites:

    @Bulb Do you want to imply that Amazon, Google, and others do not use specific magic incantations of such listing?
    That would be truly astonishing.
    :surprised-pikachu:

    I'm totally sure they do (slightly different incantation each, of course). They wouldn't want customers switching to another storage easily either, would they?



  • @BernieTheBernie said in Azure bites:

    Somewhere on StackOverflow I found a similar situation, and they pointed to a storage account issue (a function app requires storage space for its configuration).

    I'm surprised it wasn't configured, though. The wizard is supposed to configure it and it's … well, it's not hard to break it, you just delete the magic configuration parameter, but I wouldn't expect that to happen.

    So I went back to the Publish page of Visual Studio. There, some item mentioned storage, and next to it were 3 dots instead of a green check mark. Clicked on those 3 dots, went through the following dialogs, and eventually a green check mark appeared.

    … or were you setting it up from Visual Studio directly, not from the portal? That might explain it.

    Once I was doing anything more than playing around, I quickly abandoned the portal for creating things, tried the “ARM templates”, proceeded to Terraform, and I'm now contemplating switching to Pulumi. Though …

    … I actually got into similar trouble with that too. It wouldn't let me not set up a storage account, but I missed some language-specific setting and … it just doesn't start, good hunt, buddy. Tried setting it up with the portal, where it worked, so I compared the definitions (fortunately you can always view them) and found it. JSON View and Export Template are really the two most essential functions of the portal.

    Published the function app again, and voilà, it works! 🎉 🎆
    InvalidOperationException is such a great description of the issue...
    Well, it's :fun:! :yell-at-cloud:

    Yeah, the diagnostics are piss-poor. A programmer who knows how to log actually relevant information is an extremely rare sight, but the functions runtime indeed goes out of its way to swallow any diagnostics from the start-up.


    To be honest, when it's up to me, I'd just build a normal application using a server framework of choice and deploy it as either a container app (if it is a stand-alone component) or into Kubernetes (if I have multiple components anyway). Because then I know what's going on and can move it into another cloud when the political situation changes and someone decides Microsoft is no longer a suitable provider.

    The framework for the function apps does make it a nibble easier to set up connectors to other resources like storages, databases, and service buses or event hubs (via dependency injection and standard-ish configuration parameters). But for anything more complex than a quick hack, the difference will be small compared to the time needed to implement the actual logic, and you'll have an easier time debugging when you don't use all that magic.

    And it will also somewhat shield you from the forced upgrades. For function apps written in .нет they introduced a new runtime, “.нет isolated”, with .нет 6, and discontinued the old one with .нет 7. The framework is … a bit closer to plain old asp.нет, but totally different from the old one. Different packages, different annotations, different injected argument classes, everything changed. And that on top of them recently completely changing the storage account libraries too—funnily enough, they themselves got burnt by that change when they fumbled some refactoring of their own service code and deleted some infrastructure resources they shouldn't have. I think it was mentioned somewhere upthread.



  • Then I started a console app compiled with .Net 2 on the fancy Windows 11 machine. It does not do anything there; instead a dialog pops up asking if I want to install .Net 3.5. I click OK, and the thing starts to download the required files.
    At a speed of 3.6 Mbit/s.
    In the Microsoft Clouds when downloading files from Microsoft.
    :surprised-pikachu: :yell-at-cloud:
    And during the downloadz, the speed falls down to ... 0.
    Methinks the cloud has integrated into the thick fog of Dublin.



  • @BernieTheBernie said in Azure bites:

    Methinks the cloud has integrated into the thick fog of Dublin.

    Presumably, the cloud in Dublin is drunk.



  • ... and then I resized the machine to the maximum size available to me currently: E4ds_v5, 4 vCPU, 32 GB RAM. And created a fresh map for my Garmin with OpenStreetMap data.

    Downloading some 6 GB raw data went incredibly fast - oddly much faster than the .Net installation files. After that, data have to be reformatted and merged. CPU and memory were hardly used, but the poor spinning rust disk was at 100%. Then the merged file gets split into pieces suitable for Garmin, some 180 files. It started with disk at 100%, but then data were in memory and CPU went to 100%, still lots of free memory. Finally, the single files have to be translated into Garmin format and then merged. The disk was fast enough, CPU almost always at 100%, but hardly ever more than 20GB of memory used.

    The compute part (i.e. after download) took about half the time I am used to from my old machine (4 CPU, 8 GB RAM, old spinning rust).

    So perhaps next time, I'll add a fresh high-spec SSD just for the purpose of map creation, and afterwards delete it again. And the Bms-type machines with 4 GB per vCPU seem to be good enough. Perhaps later, with a "pay-as-you-go" subscription, I might try a B16ms (16 vCPU, 64 GB) or something like that.

    Interesting to see where the limits of the processes can be found. And with the cloudz, they can be replaced with simple mouse clicks (except for the system disk...).



  • @BernieTheBernie said in Azure bites:

    app compiled with .Net 2

    The WHAT‽


    At a speed of 3.6 Mbit/s.
    In the Microsoft Clouds when downloading files from Microsoft.

    I have a similar experience with the container registry. The app we are setting up uses neural network(s), so the Docker images are rather large, and when Kubernetes is pulling the largest one, weighing ~21 GB, from the Azure container registry, it almost always throws a permission error a couple of times before succeeding. I suspect—without confirmation—that with the download taking over 15 minutes, the access token it uses expires during the download, which makes it fail—though fortunately enough progress has been made that after refreshing the token it can continue and eventually succeed. It just means a new node takes around an hour 😱 to start.



  • @BernieTheBernie I didn't find a good description of what the difference between the various “series” of machines is supposed to be. Except one—the B series apparently have burst quota on the disk access, giving you low average access rate, but a lot higher momentary access rate if you don't keep it for too long—which is quite common, so I almost always pick those these days.



  • @Bulb said in Azure bites:

    The WHAT‽

    I still have a major program of mine compiled with .Net 2 too.
    Why?
    Because I got a Visual Studio 2005 pro from attending a school partnered with Microsoft. Permanent CD key, no user account, only problem is C# 2.0 is quite unwieldy compared to later versions.

    I plan to get VS Code with the .NET Core SDK so I can learn .NET Core and port that project of mine, but... time and energy, man.


  • And then the murders began.

    @Medinoc said in Azure bites:

    I plan to get VS Code with the .NET Core SDK so I can learn .NET Core and port that project of mine, but... time and energy, man.

    Why not start by using VS Community 2022 and bring it up to .NET Framework 4.8? A lot simpler than a .NET Core port, and you don't have to worry about the 2029 EOL for .NET Framework 3.5.

    The tooling in VS2022 might also make the eventual .NET Core port easier.



  • @Unperverted-Vixen Could be... Does VS Community require logging in?


  • Banned

    @Medinoc yes but so does your phone. What's your point?



  • @Gustav It might have picked the account up from Windows, but I don't remember logging in to VS 2022 Community the last time I installed it.


  • Discourse touched me in a no-no place

    It nags you about signing in but it's not required.



  • @Unperverted-Vixen said in Azure bites:

    don’t have to worry about the 2029 EOL for .NET Framework 3.5.

    It should actually be fine to have a program in .NET 2.0 if you didn't have to modify it for ages, just like it is perfectly fine to have a program written in C89 compiled and linked against the old libc that came with VC++ 6.0, because that's still included in Windows, or in Java 1.1, because that still runs on a recent JRE. Because contrary to popular belief, software does not actually bit-rot. It is the need to download an older runtime, because the newer runtime does not support the older software, that is the :wtf: here.


  • Banned

    @Bulb said in Azure bites:

    @Gustav It might have picked the account up from Windows, but I don't remember logging in to VS 2022 Community the last time I installed it.

    It does pick it up from Windows. I didn't have an MS account; I installed VS, it asked me to log in, and now I'm logged in system-wide.


  • Banned

    @loopback0 said in Azure bites:

    It nags you about signing in but it's not required.

    I'm pretty sure it was required at least at some point between 2016 and 2022.


  • And then the murders began.

    @Bulb I agree that Microsoft dropped the ball when .NET 2.0 and 4.0 had hard runtime breaks. But there's nothing I can do to fix that.



  • @Bulb said in Azure bites:

    It should actually be fine to have a program in .NET 2.0

    That's what my favorite PDF printer is in. Always have to remember to go into Settings to install .net from wherever they've hidden it now.


  • Notification Spam Recipient

    @BernieTheBernie said in Azure bites:

    and then I resized the machine to the maximum size available to me currently:

    But.... why?

    @BernieTheBernie said in Azure bites:

    created a fresh map for my Garmin with OpenStreetMap data.

    Did you really need it Right Now ASAP Expedite?

    @BernieTheBernie said in Azure bites:

    CPU and memory were hardly used, but the poor spinning rust disk was at 100%.

    Yeah, you probably ran out of IOPS credits or whatever.

    @BernieTheBernie said in Azure bites:

    Interesting to see where the limits of the processes can be found. And with the cloudz, they can be replaced with simple mouse clicks (except for the system disk...).

    I suppose profiling a program outside the cloudz should be more efficient, but hey, free trial is free trial I suppose....


  • Notification Spam Recipient

    @Medinoc said in Azure bites:

    @Unperverted-Vixen Could be... Does VS Community require logging in?

    0b28539d-f4ec-43d8-853f-caeeb7ca133c-image.png

    I'm not. :mlp_shrug:


  • Notification Spam Recipient

    @Bulb said in Azure bites:

    It is the need to download older runtime, because the newer runtime does not support the older software, that is the :wtf: here.

    Ostensibly it's to reduce the available attack surface, but I don't see it.



  • @Tsaukpaetra said in Azure bites:

    I suppose profiling a program outside the cloudz should be more efficient, but hey, free trial is free trial I suppose....

    You should also consider that my machines at home are real vintage hardware. And yes, I know whom I'm telling that to: you have a strong competitor, believe it or not!



  • @Tsaukpaetra said in Azure bites:

    I'm not. :mlp_shrug:

    No, but you're you and can generally be assumed to be in need of a reboot. :trollface:

