| Commit message | Author | Age | Files | Lines |
| |
It’s referenced in id.txt, but I forgot to add it.
|
| |
I’ve been expanding the list somewhat, so it’s time to put the entries
on the website. This moves them into a separate file, which was easier
than expected.
|
| |
- Add Paris as location
- bump date
- Change XMPP account to headcounter.org
- clearsign
|
| |
These tell the browser that it’s going to need those resources later,
even if it hasn’t discovered them yet (e.g. the fonts are only found
after the CSS has loaded).
|
| |
I'm not familiar with the "easy-dhall-nix" project, but the repository
is imported into Vuizvui via import-from-derivation. While this by
itself is not a big issue (apart from contributing hugely to evaluation
time, we're already at around an hour), the "dhall-nix" derivation is in
turn imported again via importDhall, so whenever something breaks with
dhall-nix, our evaluation will break as well.
Unfortunately, something is broken right now:
  building '/nix/store/c363947v9qk287d07qj2kpj60rmzwalj-dhall-nix-1.1.14-x86_64-linux.tar.bz2.drv'...
  trying https://github.com/dhall-lang/dhall-haskell/releases/download/1.32.0/dhall-nix-1.1.14-x86_64-linux.tar.bz2
    % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                   Dload  Upload   Total   Spent    Left  Speed
  100   648  100   648    0     0   2592      0 --:--:-- --:--:-- --:--:--  2581
  100 2255k  100 2255k    0     0  1639k      0  0:00:01  0:00:01 --:--:-- 6287k
  hash mismatch in fixed-output derivation '/nix/store/yhls1ffnvp1nbjsm0xr3l1z6j6x4waqh-dhall-nix-1.1.14-x86_64-linux.tar.bz2':
    wanted: sha256:1j32jf0is0kikfw7h9w3n8ikw70bargr32d1cyasqgmb7s7mvs1c
    got:    sha256:1qs5p05qfk5xs1ajwyhn27m0bzs96lnlf3b4gnkffajhaq7hiqll
  cannot build derivation '/nix/store/aj5ag721b9gm4an6yxh2ljg19ixg4alv-dhall-nix-simple.drv': 1 dependencies couldn't be built
This happens because GitHub's tarballs are not deterministic: whenever
GitHub changes the way they are generated, we get a hash mismatch.
For exactly that reason, the fetchFromGitHub wrapper in nixpkgs uses
fetchzip instead of fetchurl, so that file ordering in the archive
doesn't matter.
Unfortunately, the upstream project still uses fetchurl, but since the
URLs and hashes have changed due to a bump to Dhall version 1.33.1, I've
chosen to switch to latest master instead of monkey-patching via
extraPostFetch.
With this bandaid, we shouldn't run into a hash mismatch until the next
GC, or until the upstream project switches to fetchFromGitHub or
fetchzip.
Signed-off-by: aszlig <aszlig@nix.build>
Cc: @Profpatsch, @justinwoo
|
| |
I want to be able to open http(s) links that are e.g. images directly
in the right application. In other words, web URLs should be
transparent, instead of always opening everything in the browser.
This adds some silly ways of connecting to the server and parsing
out the headers, in order to fetch the content type.
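The header-parsing step could look roughly like this (a hedged sketch
only; the function name, signature, and raw-response handling are made
up for illustration and are not the actual code):

```rust
// Hypothetical sketch: extract the Content-Type from the raw header
// block of an HTTP response, so the caller can pick the right
// application to open the URL with.
fn content_type(raw_response: &str) -> Option<&str> {
    raw_response
        // headers end at the first blank line
        .split("\r\n\r\n")
        .next()?
        .lines()
        // skip the status line, then look at "Name: value" pairs
        .skip(1)
        .find_map(|line| {
            let (name, value) = line.split_once(':')?;
            if name.eq_ignore_ascii_case("content-type") {
                Some(value.trim())
            } else {
                None
            }
        })
}

fn main() {
    let resp = "HTTP/1.1 200 OK\r\nContent-Type: image/png\r\n\r\n<binary>";
    assert_eq!(content_type(resp), Some("image/png"));
}
```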
|
| |
This is an experiment about whether we can get away with using the
non-recursive version by default.
The U::Record variant uses a Vec instead of a HashMap by default, to
make encoding from lists easier, and keep the ordering as given.
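The Vec-of-pairs idea can be sketched like this (names are illustrative,
not the real netencode API):

```rust
// Hypothetical sketch: a record variant that stores its fields in a
// Vec of pairs instead of a HashMap, so encoding from a list of
// (key, value) pairs is trivial and insertion order survives.
#[derive(Debug, PartialEq)]
enum U<'a> {
    Text(&'a str),
    // Vec keeps the fields in exactly the order they were given
    Record(Vec<(&'a str, U<'a>)>),
}

fn main() {
    let rec = U::Record(vec![
        ("b", U::Text("first")),
        ("a", U::Text("second")),
    ]);
    // With a HashMap the iteration order would be unspecified;
    // with a Vec, "b" reliably comes before "a".
    if let U::Record(fields) = &rec {
        assert_eq!(fields[0].0, "b");
        assert_eq!(fields[1].0, "a");
    }
}
```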
|
| |
It’s a lot simpler to just export the parsed attributes as env vars.
Remove the substitute code (it already went into the el_substitute
lib anyway) and replace the xpathexec0 code with the function from the
el_exec lib.
|
| |
A small parser for http/https URLs.
Substitutes host/port/path in argv.
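A sketch of what such a parser has to do (illustrative only; the real
implementation may be structured quite differently):

```rust
// Hypothetical sketch: split an http/https URL into host, port and
// path, defaulting the port from the scheme, so the pieces can be
// substituted into argv.
fn parse_http_url(url: &str) -> Option<(String, u16, String)> {
    let (scheme, rest) = url.split_once("://")?;
    let default_port = match scheme {
        "http" => 80,
        "https" => 443,
        _ => return None, // only http/https are supported
    };
    // split authority from path; path defaults to "/"
    let (authority, path) = match rest.split_once('/') {
        Some((a, p)) => (a, format!("/{}", p)),
        None => (rest, "/".to_string()),
    };
    // an explicit ":port" overrides the scheme default
    let (host, port) = match authority.split_once(':') {
        Some((h, p)) => (h, p.parse().ok()?),
        None => (authority, default_port),
    };
    Some((host.to_string(), port, path))
}

fn main() {
    assert_eq!(
        parse_http_url("https://example.com/foo/bar"),
        Some(("example.com".to_string(), 443, "/foo/bar".to_string()))
    );
}
```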
|
| |
This was previously located verbatim on my webserver.
Since `df.eu` thought it was a good idea to unilaterally cancel it
when I moved my domain, it is now a good idea to nixify what was
there.
|
| |
The C implementation of el_semicolon in execline only reads one
argument at a time and returns an index into the rest of argv.
This makes sense for the usual application: a program reads a few
arguments and a block or two, and then executes into `prog`. `prog`
could be anything really, including additional blocks.
The new `el_semicolon_full_argv` function exports the previous
behaviour of parsing the whole argv at once.
As a nice side effect, we return the rest of argv in place.
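The whole-argv parse can be sketched as follows (a simplified model,
not the real code; the actual execline block encoding differs in
details, but the shape is: block elements are space-prefixed and a
block ends with an empty argument):

```rust
// Hypothetical sketch: parse all leading blocks out of argv at once
// and hand back the unconsumed rest of argv in place, instead of
// handing out one argument at a time like the C el_semicolon.
fn parse_blocks<'a>(argv: &'a [&'a str]) -> (Vec<Vec<&'a str>>, &'a [&'a str]) {
    let mut blocks = Vec::new();
    let mut rest = argv;
    loop {
        match rest.first() {
            // an arg starting with a space opens/continues a block
            Some(arg) if arg.starts_with(' ') || arg.is_empty() => {
                let mut block = Vec::new();
                while let Some((arg, tail)) = rest.split_first() {
                    rest = tail;
                    if arg.is_empty() {
                        break; // "" terminates the block
                    }
                    block.push(&arg[1..]); // strip the space prefix
                }
                blocks.push(block);
            }
            // first non-block arg: everything from here is `prog`
            _ => return (blocks, rest),
        }
    }
}

fn main() {
    // { echo hi } prog arg  is encoded as  " echo", " hi", "", "prog", "arg"
    let argv = [" echo", " hi", "", "prog", "arg"];
    let (blocks, rest) = parse_blocks(&argv);
    assert_eq!(blocks, vec![vec!["echo", "hi"]]);
    assert_eq!(rest, ["prog", "arg"]);
}
```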
|
| |
el_exec: wraps the various execve wrappers in skalibs that are useful
for writing execline-like utils. Currently only `xpathexec0` is
supported, which execs into the argv you give it, or fails with the
right error if the file is not found.
el_substitute: execline argv substitution! Wraps the execline
function, so it behaves exactly the same as the existing execline
utils, like `importas`.
|
| |
We can define a more or less complete generator in fewer than 50 lines
of Nix. Nice.
|
| |
Instead of adding a new type, it just uses the 2^1 natural, which has
exactly two possibilities.
|
| |
The “shallow” parser uses the fact that every netencode value is
length-encoded (or a scalar with a fixed length).
It does not need to parse the inner values in order to get the
structure of the thing.
That means that we can implement very fast structure-based operations,
like “take the first 5 elements of a list” or “get the record value
with the key name `foo`”.
We can even do things like intersperse elements into a list of values
and write the resulting netencode structure to a socket, without ever
needing to copy the data (it’s all length-indexed pointers to bytes).
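A toy illustration of the trick, using plain netstrings ("5:hello,")
rather than the actual netencode syntax: because every value carries
its length up front, we can find the extent of a value, and thus slice,
skip, or forward it, without ever parsing its contents.

```rust
// Hypothetical sketch: given a length-prefixed value at the start of
// `input`, return its payload bytes and everything after it, without
// looking inside the payload at all.
fn skip_value(input: &[u8]) -> Option<(&[u8], &[u8])> {
    // parse the decimal length prefix up to ':'
    let colon = input.iter().position(|&b| b == b':')?;
    let len: usize = std::str::from_utf8(&input[..colon]).ok()?.parse().ok()?;
    let end = colon + 1 + len;
    // the value must be followed by its trailing ','
    if input.get(end) != Some(&b',') {
        return None;
    }
    Some((&input[colon + 1..end], &input[end + 1..]))
}

fn main() {
    // take the first element of a "list" of two netstrings without
    // inspecting the second one at all
    let (first, rest) = skip_value(b"5:hello,5:world,").unwrap();
    assert_eq!(first, &b"hello"[..]);
    assert_eq!(rest, &b"5:world,"[..]);
}
```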
|
| |
Less generic, has the spirit of “netstrings, but extended to a
structured encoding format”.
|
| |
https://dotti.me/
DOT-TIME(7)                      TIME FORMATS                      DOT-TIME(7)

NAME
       dot-time - a universal convention for conveying time

DESCRIPTION
       For those of us who travel often or coordinate across many timezones,
       working with local time is frequently impractical. ISO8601, in all its
       wisdom, allows for time zone designators, but still represents the
       hours and minutes as local time, thus making it inconvenient for
       quickly comparing timestamps from different locations.

       Dot time instead uses UTC for all date, hour, and minute indications,
       and while it allows for time zone designators, they are optional
       information that can be dropped without changing the indicated time.
       It uses an alternate hour separator to make it easy to distinguish
       from regular ISO8601. When a time zone designator is provided, one
       can easily obtain the matching local time by adding the UTC offset to
       the UTC time.

EXAMPLES
       These timestamps all represent the same point in time.

       ┌─────────────────────┬─────────────────────┐
       │ dot time            │ ISO8601             │
       ├─────────────────────┼─────────────────────┤
       │ 2019-06-19T22·13-04 │ 2019-06-19T18:13-04 │
       ├─────────────────────┼─────────────────────┤
       │ 2019-06-19T22·13+00 │ 2019-06-19T22:13+00 │
       ├─────────────────────┼─────────────────────┤
       │ 2019-06-19T22·13+02 │ 2019-06-20T00:13+02 │
       └─────────────────────┴─────────────────────┘

2019-06-19                                                        DOT-TIME(7)
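The conversion rule quoted above (add the UTC offset to the UTC time)
can be sketched as a one-liner; the function name is made up, and
minute-level offsets are ignored for brevity:

```rust
// dot time always shows UTC, so the matching local hour is just the
// UTC hour plus the offset, wrapping around midnight.
fn local_hour(utc_hour: i32, utc_offset: i32) -> i32 {
    (utc_hour + utc_offset).rem_euclid(24)
}

fn main() {
    // 2019-06-19T22·13-04  →  18:13 local
    assert_eq!(local_hour(22, -4), 18);
    // 2019-06-19T22·13+02  →  00:13 local (the next day)
    assert_eq!(local_hour(22, 2), 0);
}
```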
|
| |
Uses the nom parsing combinator library.
|
| |
Change the number format to be more concise, working in steps of 2^n,
going from 2^1 (1 bit) to 2^9 (512 bits), though implementations are
free to define the biggest numbers they want to support.
Records get the marker `{` and are closed by `}`, so the braces match
up nicely, similar to lists.
|
| |
This is now on par with the original script in
https://github.com/Profpatsch/dotfiles/blob/a25c6c419525bef7ef5985f664b058dc9eb919e9/scripts/scripts/xdg-open
Eventually it should probably migrate away from a generated bash
script, but for now it’s fine.
|
| |
Like a normal `import`, but for Dhall files. `importDhall2` can
additionally handle dependencies and extra source files, though the
interface is not stable yet.
|