This only works if Credential Guard has implemented a way to derive a subsequent token/value from that secret. For something like basic auth, the secret would eventually need to reach the userland process that needs it, in some shape or form, so it can be embedded in the HTTP payload, which is plaintext.
> *nix fanboys were totes fine with wget and ls being aliases in PowerShell for years, but when they found out that PS was coming to Linux they made the biggest stink
The curl and wget aliases don’t exist in PowerShell 7, which is the cross-platform version. Only the old powershell.exe built into Windows has these aliases, and it’s worse today because curl.exe now ships with Windows and the curl alias takes priority when you run just curl.
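One way to see which command wins in a given session (behaviour differs between the two editions):

```powershell
# Windows PowerShell 5.1: reports the curl alias (for Invoke-WebRequest),
# because aliases outrank executables on PATH in command precedence.
# PowerShell 7+: reports curl.exe, since the alias no longer exists.
Get-Command curl

# On 5.1 you can drop the alias for the current session so curl.exe wins:
Remove-Item Alias:curl -ErrorAction SilentlyContinue
```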
It's "Windows PowerShell" that will forever be v5.1, and "PowerShell" is v7+.
(we don't talk about "PowerShell Core")
> builtin and the curl alias takes priority when you run just curl
Yes, but again: if somebody didn't bother to read the docs or read the output (it's very obvious when you have a PS error vs. anything else, and people STILL don't bother to try to understand it) and instead started bitching on the forums... see my previous comment.
And by the way: it was established quite early that using aliases in written code should be frowned upon, exactly because aliases aren't stable and can be local. Aliases are the quick way when you are slapping something together interactively in the CLI.
So wget/curl were added for the benefit of those *nix fanboys who needed something better than cmd.exe on Windows, so they could start using PS faster and later adapt to the proper ways. But instead of reading the docs, they just raised a stink.
The runas command doesn’t elevate; it just runs as another user. This is a console executable that drives UAC and also provides a way to capture the stdout/stderr of the elevated process, which isn’t natively possible today without your own wrapper.
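For comparison, the closest built-in approximation is Start-Process with -Verb RunAs, which does trigger UAC but can't capture output, since the relevant parameters live in mutually exclusive parameter sets:

```powershell
# Triggers a UAC prompt and runs the process elevated.
Start-Process -FilePath 'whoami.exe' -ArgumentList '/groups' -Verb RunAs -Wait

# This is why a wrapper is needed: -Verb RunAs goes through ShellExecute,
# which cannot be combined with -RedirectStandardOutput/-RedirectStandardError.
```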
Looks like Microsoft is trying to stop internal products from using MSI in favour of MSIX packages. MSIX is nice for interactive applications but has a few issues with system-wide installations and automation scenarios, which I doubt the PowerShell team will be able to solve by the time of the 7.7 release.
I don't mind NRT, but I hate dealing with C# projects that haven't set <Nullable>enable</Nullable> in their csproj. It's not perfect, because I know a value can still be null at runtime, but it's nice when the compiler does most of the checks for you.
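For reference, it's a single property in the project file (the TargetFramework here is just illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <!-- Turns on nullable reference type annotations and warnings project-wide -->
    <Nullable>enable</Nullable>
  </PropertyGroup>
</Project>
```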
> They refused to invest in packaging to the extent that a separate company (astral) had to do it for them
uv didn't just happen in a vacuum; there has been a lot of investment in the Python packaging ecosystem that enabled it (and other tools) to try to improve the shortcomings of Python packaging.
There's PEP 518 [1] for build requirements, PEP 600 [2] for manylinux wheels, PEP 621 [3] for pyproject.toml, PEP 656 [4] for musl wheels platform identifiers, PEP 723 [5] for inline script metadata.
Without all this, uv wouldn't be a thing and we would be stuck with pip and setuptools, or a bunch more band-aid hacks on top making the whole thing brittle.
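As an illustration of what PEP 723 enabled: a script can now declare its own requirements inline, and a runner like `uv run` reads the TOML block before executing (hypothetical script; the dependency list is left empty so it also runs under plain Python):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# A PEP 723 runner (e.g. uv) parses the metadata comment above and sets up
# an environment matching it before running the code below.
import sys

print(f"running under Python {sys.version_info.major}.{sys.version_info.minor}")
```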
Obviously, but writing PEPs is not enough. Read through the comments under any Python thread here from the late 2010s to early 2020s. Just ~two years ago you couldn't talk about anything Python-related without the discussion veering far off-topic into complaints about packaging.
That's the thing, you don't have to :) While I think uv is a great tool and highly recommend it, you are more than welcome to use any of the other build backends or package management tools that fit your workstyle. By having these packaging PEPs (amongst others), the ecosystem has been able to try out different approaches, and over time it will most likely consolidate on the ones that work better than the others.
Anecdata, but uv served as a very good packaging mechanism for a Python library I had to throw on an in extremis box, one that is not connected to the Internet in any way, and one where messing with the system Python was verboten and Docker was a four-letter word.
There shouldn't be any difference between those two values. I'm not saying you're wrong and it didn't break, but it's definitely surprising that a parser would choke on that rather than YAML itself being the problem.
Don't get me wrong, I can empathise with whitespace formatting being annoying, and having both forms be valid just adds confusion; it's just surprising to see this was the problem.
& has no special behaviour in strings; backticks and $, on the other hand, do. For example, "&Some String&" and '&Some String&' both have the literal value `&Some String&`. Backticks and $ are special in double-quoted strings, as they are the escape character and variable-reference character respectively.
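A quick interactive demonstration (made-up strings):

```powershell
$name = 'World'

# Single quotes: everything is literal; $ and backtick are not special.
'Hi $name `n'        # -> Hi $name `n

# Double quotes: $name expands and `n becomes a newline.
"Hi $name `n"        # -> Hi World followed by a newline

# & is never special inside either quoting style:
"&Some String&" -eq '&Some String&'   # -> True
```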
per-minute is really just a way to express the cost with a human-friendly number. Per-hour, per-second, or per-day pricing could all come out to the same total, just expressed as a different number. If anything, per-minute is better than per-hour, because you won't be charged for minutes you don't use.
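To make the arithmetic concrete (the rates are made up, and prices are in tenths of a cent to keep the maths exact):

```python
import math

# Hypothetical rate: 8 tenths-of-a-cent per minute ($0.008/min),
# which is exactly 480 tenths-of-a-cent per hour ($0.48/hr).
per_minute = 8
per_hour = per_minute * 60

job_minutes = 90  # a 1.5-hour job

# Per-minute billing charges only what you use:
minute_billed = job_minutes * per_minute              # 720 -> $0.72

# Per-hour billing typically rounds up to whole hours:
hour_billed = math.ceil(job_minutes / 60) * per_hour  # 2 hrs -> 960 -> $0.96

print(minute_billed, hour_billed)  # 720 960
```

Same underlying rate, but the coarser billing unit costs more for any job that isn't an exact multiple of it.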
But why not make it "per GB of logs ingested" or "per triggered job" (or both)? Those should reflect the points where GitHub actually incurs costs - and those aren't per minute.
> I don't knock it out of my head by having the wire catching on something
> Dealing with the cable and having to pack it back up when I'm done
> It auto connects to both my phone and laptop 99% of the time
> They easily swap between the 2 as I change the focus
Now, they aren't perfect (charging can get a bit fiddly over time), but they are certainly nicer than normal headphones. Maybe you just aren't the target audience, but clearly they are popular enough for most people.