<?xml version="1.0" encoding="UTF-8" ?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
	<title>Tanguy Ortolo - Debian</title>
	<link>https://tanguy.ortolo.eu/blog/categorie2/debian</link>
	<language>en</language>
	<description>a blog about Debian and self-hosting</description>
<atom:link xmlns:atom="http://www.w3.org/2005/Atom" rel="self" type="application/rss+xml" href="https://tanguy.ortolo.eu/blog/feed.php" />
	<lastBuildDate>Tue, 21 Mar 2017 19:33:00 +0000</lastBuildDate>
	<generator>PluXml</generator>
	<item>
		<title>Bad support of ZIP archives with extra fields</title> 
		<link>https://tanguy.ortolo.eu/blog/article156/zip-extra-field</link>
		<guid>https://tanguy.ortolo.eu/blog/article156/zip-extra-field</guid>
		<description>&lt;p&gt;For sharing multiple files, it is often convenient to pack them
into an archive, and the most widely supported format to do so is
probably ZIP. Under *nix, you can archive a directory with Info-ZIP:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% zip -r something.zip something/
&lt;/pre&gt;

&lt;p&gt;(When you have several files, it is recommended to archive them in a
directory, to avoid cluttering the directory where people will extract
them.)&lt;/p&gt;&lt;h2&gt;Unsupported ZIP archive&lt;/h2&gt;

&lt;p&gt;Unfortunately, while we would expect ZIP files to be widely
supported, I found out that this is not always the case, and I had
many recipients failing to open them under operating systems such as
iOS.&lt;/p&gt;

&lt;h2&gt;Avoid extra fields&lt;/h2&gt;

&lt;p&gt;That issue seems to be linked to the use of extra fields, which are
enabled by default in order to store Unix file metadata. The extra
field was designed from the beginning so that each implementation can
take into account the attributes it supports and ignore any other
ones, but some buggy ZIP implementations appear not to function at all
when it is present.&lt;/p&gt;
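
&lt;p&gt;To check whether an existing archive contains such fields, you
should be able to use the verbose listing of Info-ZIP&#039;s &lt;code
    class=&quot;command&quot;&gt;zipinfo&lt;/code&gt;, which reports the length of the
extra field of each member:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% zipinfo -v something.zip | grep -i &#039;extra field&#039;
&lt;/pre&gt;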

&lt;p&gt;Therefore, unless you actually need to preserve Unix file metadata,
you should avoid using extra fields. With Info-ZIP, you would have to
add the option &lt;code&gt;-X&lt;/code&gt;:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% zip -rX something.zip something/
&lt;/pre&gt;</description>
		<pubDate>Tue, 21 Mar 2017 19:33:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Generate man pages for awscli</title> 
		<link>https://tanguy.ortolo.eu/blog/article153/awscli-manpages</link>
		<guid>https://tanguy.ortolo.eu/blog/article153/awscli-manpages</guid>
		<description>&lt;h2&gt;No man pages, but almost&lt;/h2&gt;

&lt;p&gt;The AWS Command Line Interface, which is available in Debian,
provides no man page. Instead, that tool has an integrated help system,
which allows you to run commands such as &lt;code class=&quot;command&quot;&gt;aws rds
    help&lt;/code&gt;, which, from what I have seen, generates some
reStructuredText, then converts it to a man page in troff format, then
calls troff to convert it to text with basic formatting, and eventually
passes it to a pager. Since this is close to what &lt;em&gt;man&lt;/em&gt; does, the
result looks like a degraded man page, with some features missing such
as the adaptation to the terminal width.&lt;/p&gt;

&lt;p&gt;Well, this is better than nothing, and better than what many
under-documented tools can offer, but for several reasons, it still
sucks: most importantly, it does not respect administrators&#039; habits and
it does not integrate with the system man database. It does not
allow you to use commands such as &lt;code class=&quot;command&quot;&gt;apropos&lt;/code&gt;,
and you will get no man page name auto-completion from your shell since
there is no man page.&lt;/p&gt;&lt;h2&gt;Generate the man pages&lt;/h2&gt;

&lt;p&gt;Now, since the integrated help system does generate a man page
internally, we can hack it to output it, and save it to a file:&lt;/p&gt;

&lt;pre class=&quot;code diff patch&quot;&gt;
Description: Enable a mode to generate troff man pages
 The awscli help system internally uses man pages, but only to convert
 them to text and show them with the pager. This patch enables a mode
 that prints the troff code so the user can save the man page.
 .
 To use that mode, run the help commands with an environment variable
 OUTPUT set to &#039;troff&#039;, for instance:
     OUTPUT=&#039;troff&#039; aws rds help
Forwarded: no
Author: Tanguy Ortolo &amp;lt;tanguy+debian@ortolo.eu&amp;gt;
Last-Update: 2016-11-22

Index: /usr/lib/python3/dist-packages/awscli/help.py
===================================================================
--- /usr/lib/python3/dist-packages/awscli/help.py       2016-11-21 12:14:22.236254730 +0100
+++ /usr/lib/python3/dist-packages/awscli/help.py       2016-11-21 12:14:22.236254730 +0100
@@ -49,6 +49,8 @@
     Return the appropriate HelpRenderer implementation for the
     current platform.
     &quot;&quot;&quot;
+    if &#039;OUTPUT&#039; in os.environ and os.environ[&#039;OUTPUT&#039;] == &#039;troff&#039;:
+        return TroffHelpRenderer()
     if platform.system() == &#039;Windows&#039;:
         return WindowsHelpRenderer()
     else:
@@ -97,6 +99,15 @@
         return contents


+class TroffHelpRenderer(object):
+    &quot;&quot;&quot;
+    Render help content as troff code.
+    &quot;&quot;&quot;
+
+    def render(self, contents):
+        sys.stdout.buffer.write(publish_string(contents, writer=manpage.Writer()))
+
+
 class PosixHelpRenderer(PagingHelpRenderer):
     &quot;&quot;&quot;
     Render help content on a Posix-like system.  This includes
&lt;/pre&gt;

&lt;p&gt;This patch must be applied from the root directory with &lt;code
     class=&quot;command&quot;&gt;patch -p0&lt;/code&gt;, otherwise GNU patch will
refuse to work on files with absolute names.&lt;/p&gt;
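
&lt;p&gt;Assuming you saved that patch as &lt;em
     class=&quot;filename&quot;&gt;awscli-troff.patch&lt;/em&gt; (any name will do),
applying it should look like this:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% cd /
% sudo patch -p0 &amp;lt; ~/awscli-troff.patch
&lt;/pre&gt;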

&lt;p&gt;With that patch, you can run help commands with an environment
variable &lt;code class=&quot;environment variable&quot;&gt;OUTPUT=&#039;troff&#039;&lt;/code&gt; to get
the man page to use it as you like, for instance:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% OUTPUT=&#039;troff&#039; aws rds help &amp;gt; aws_rds.1
% man -lt aws_rds.1 | lp
&lt;/pre&gt;

&lt;h2&gt;Generate all the man pages&lt;/h2&gt;

&lt;p&gt;Now that we are able to generate the man page of any aws command,
all we need to generate all of them is a list of all the available
commands. This is not that easy, because the commands are somehow
derived from functions provided by a Python library named botocore,
which are derived from a bunch of configuration files, and some of them
are added, removed or renamed. Anyway, I have been able to write a
Python script that does that, but it includes a static list of these
modifications:&lt;/p&gt;

&lt;pre class=&quot;python code&quot;&gt;
#! /usr/bin/python3

import os
import subprocess
import awscli.clidriver


def write_manpage(command):
    manpage = open(&#039;%s.1&#039; % &#039;_&#039;.join(command), &#039;w&#039;)
    command.append(&#039;help&#039;)
    # Preserve the current environment (PATH in particular), adding OUTPUT
    process = subprocess.Popen(command,
            env=dict(os.environ, OUTPUT=&#039;troff&#039;),
            stdout=manpage)
    process.wait()
    manpage.close()


driver = awscli.clidriver.CLIDriver()
command_table = driver._get_command_table()

renamed_commands = \
    {
        &#039;config&#039;: &#039;configservice&#039;,
        &#039;codedeploy&#039;: &#039;deploy&#039;,
        &#039;s3&#039;: &#039;s3api&#039;
    }
added_commands = \
    {
        &#039;s3&#039;: [&#039;cp&#039;, &#039;ls&#039;, &#039;mb&#039;, &#039;mv&#039;, &#039;presign&#039;, &#039;rb&#039;, &#039;rm&#039;, &#039;sync&#039;,
               &#039;website&#039;]
    }
removed_subcommands = \
    {
        &#039;ses&#039;: [&#039;delete-verified-email-address&#039;,
                &#039;list-verified-email-addresses&#039;,
                &#039;verify-email-address&#039;],
        &#039;ec2&#039;: [&#039;import-instance&#039;, &#039;import-volume&#039;],
        &#039;emr&#039;: [&#039;run-job-flow&#039;, &#039;describe-job-flows&#039;,
                &#039;add-job-flow-steps&#039;, &#039;terminate-job-flows&#039;,
                &#039;list-bootstrap-actions&#039;, &#039;list-instance-groups&#039;,
                &#039;set-termination-protection&#039;,
                &#039;set-visible-to-all-users&#039;],
        &#039;rds&#039;: [&#039;modify-option-group&#039;]
    }
added_subcommands = \
    {
        &#039;rds&#039;: [&#039;add-option-to-option-group&#039;,
                &#039;remove-option-from-option-group&#039;]
    }

# Build a dictionary of real commands, including renames, additions and
# removals.
real_commands = {}
for command in command_table:
    subcommands = []
    subcommand_table = command_table[command]._get_command_table()
    for subcommand in subcommand_table:
        # Skip removed subcommands
        if command in removed_subcommands \
                and subcommand in removed_subcommands[command]:
            continue
        subcommands.append(subcommand)
    # Add added subcommands
    if command in added_subcommands:
        for subcommand in added_subcommands[command]:
            subcommands.append(subcommand)
    # Directly add non-renamed commands
    if command not in renamed_commands:
        real_commands[command] = subcommands
    # Add renamed commands
    else:
        real_commands[renamed_commands[command]] = subcommands
# Add added commands
for command in added_commands:
    real_commands[command] = added_commands[command]

# For each real command and subcommand, generate a manpage
write_manpage([&#039;aws&#039;])
for command in real_commands:
    write_manpage([&#039;aws&#039;, command])
    for subcommand in real_commands[command]:
        write_manpage([&#039;aws&#039;, command, subcommand])
&lt;/pre&gt;

&lt;p&gt;This script will generate more than 2,000 man page files in the
current directory; you will then be able to move them to &lt;em
     class=&quot;filename&quot;&gt;/usr/local/share/man/man1&lt;/em&gt;.&lt;/p&gt;
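
&lt;p&gt;Once the pages are in place, refreshing the system man database
with &lt;code class=&quot;command&quot;&gt;mandb&lt;/code&gt; should make them available to
&lt;code class=&quot;command&quot;&gt;apropos&lt;/code&gt; and to your shell&#039;s
completion:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% sudo mv *.1 /usr/local/share/man/man1/
% sudo mandb
&lt;/pre&gt;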


&lt;p&gt;Since this is a lot of man pages, it may be appropriate to
concatenate them by major command, for instance all the &lt;code
     class=&quot;command&quot;&gt;aws rds&lt;/code&gt; together…&lt;/p&gt;</description>
		<pubDate>Wed, 23 Nov 2016 17:25:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Process command line arguments in shell</title> 
		<link>https://tanguy.ortolo.eu/blog/article150/shell-process-arguments</link>
		<guid>https://tanguy.ortolo.eu/blog/article150/shell-process-arguments</guid>
		<description>&lt;p&gt;When writing a wrapper script, one often has to process the command
line arguments to transform them according to his needs, to change some
arguments, to remove or insert some, or perhaps to reorder them.&lt;/p&gt;&lt;h2&gt;Naive approach&lt;/h2&gt;

&lt;p&gt;The naive approach to do that is&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#note1&quot; id=&quot;call1&quot;&gt;¹&lt;/a&gt;:&lt;/p&gt;

&lt;pre class=&quot;code script shell&quot;&gt;
# Process arguments, building a new argument list
new_args=&quot;&quot;
for arg in &quot;$@&quot;
do
    case &quot;$arg&quot;
    in
        --foobar)
            # Convert --foobar to the new syntax --foo=bar
            new_args=&quot;$new_args --foo=bar&quot;
        ;;
        *)
            # Take other options as they are
            new_args=&quot;$new_args $arg&quot;
        ;;
    esac
done

# Call the actual program
exec program $new_args
&lt;/pre&gt;

&lt;p&gt;This naive approach is simple, but fragile, as it will break on
arguments that contain a space. For instance, calling &lt;code
    class=&quot;shell&quot;&gt;wrapper --foobar &quot;some file&quot;&lt;/code&gt; (where &lt;code&gt;some
    file&lt;/code&gt; is a single argument) will result in the call &lt;code
    class=&quot;shell&quot;&gt;program --foo=bar some file&lt;/code&gt; (where
&lt;code&gt;some&lt;/code&gt; and &lt;code&gt;file&lt;/code&gt; are two distinct arguments).&lt;/p&gt;

&lt;h2&gt;Correct approach&lt;/h2&gt;

&lt;p&gt;To handle spaces in arguments, we need either:&lt;/p&gt;

&lt;ul&gt;
    &lt;li&gt;to quote them in the new argument list, but that requires
        escaping possible quotes they contain, which would be
        error-prone, and implies using external programs such as
        sed;&lt;/li&gt;
    &lt;li&gt;to use an actual list or array, which is a feature of advanced
        shells such as Bash or Zsh, not standard shell…&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;… except standard shell does support arrays, or rather, it does
support &lt;em&gt;one specific array&lt;/em&gt;: the positional parameter list
&lt;code class=&quot;shell&quot;&gt;&quot;$@&quot;&lt;/code&gt;&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#note2&quot; id=&quot;call2&quot;&gt;²&lt;/a&gt;. This
leads to one solution to process arguments in a reliable way, which
consists in rebuilding the positional parameter list with the built-in
command &lt;code class=&quot;shell&quot;&gt;set --&lt;/code&gt;:&lt;/p&gt;

&lt;pre class=&quot;code script shell&quot;&gt;
# Process arguments, building a new argument list in &quot;$@&quot;
# &quot;$@&quot; will need to be cleared, not right now but on first iteration only
first_iter=1
for arg in &quot;$@&quot;
do
    if [ &quot;$first_iter&quot; -eq 1 ]
    then
        # Clear the argument list
        set --
        first_iter=0
    fi
    case &quot;$arg&quot;
    in
        --foobar) set -- &quot;$@&quot; --foo=bar ;;
        *) set -- &quot;$@&quot; &quot;$arg&quot; ;;
    esac
done

# Call the actual program
exec program &quot;$@&quot;
&lt;/pre&gt;
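
&lt;p&gt;You can check that behaviour by replacing the final &lt;code
    class=&quot;shell&quot;&gt;exec program &quot;$@&quot;&lt;/code&gt; with &lt;code
    class=&quot;shell&quot;&gt;printf &#039;[%s]\n&#039; &quot;$@&quot;&lt;/code&gt;, which prints each
argument between brackets on its own line:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% ./wrapper --foobar &quot;some file&quot;
[--foo=bar]
[some file]
&lt;/pre&gt;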

&lt;h2&gt;Notes&lt;/h2&gt;

&lt;ol&gt;
    &lt;li id=&quot;note1&quot;&gt;If you prefer, &lt;code class=&quot;shell&quot;&gt;for arg in &quot;$@&quot;&lt;/code&gt; can be
        simplified to just &lt;code class=&quot;shell&quot;&gt;for arg&lt;/code&gt;.&lt;a
            href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#call1&quot;&gt;↑&lt;/a&gt;&lt;/li&gt;
    &lt;li id=&quot;note2&quot;&gt;As a reminder, and contrary to what it looks like, quoted
        &lt;code class=&quot;shell&quot;&gt;&quot;$@&quot;&lt;/code&gt; does not expand to a single
        field, but to &lt;em&gt;one field per positional parameter&lt;/em&gt;. &lt;a
            href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#call2&quot;&gt;↑&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;</description>
		<pubDate>Wed, 08 Jun 2016 13:29:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Let&#039;s Encrypt: threat or opportunity to other certificate authorities?</title> 
		<link>https://tanguy.ortolo.eu/blog/article146/letsencrypt-opportunity-other-cas</link>
		<guid>https://tanguy.ortolo.eu/blog/article146/letsencrypt-opportunity-other-cas</guid>
		<description>&lt;p&gt;&lt;a href=&quot;https://letsencrypt.org/&quot; title=&quot;Let&#039;s Encrypt
website&quot;&gt;Let&#039;s Encrypt&lt;/a&gt; is a certificate authority (CA) that has
just left beta stage and provides &lt;a
    href=&quot;https://en.wikipedia.org/wiki/Domain-validated_certificate&quot;
    title=&quot;Wikipedia article about domain-validated certificates&quot;&gt;domain
    name-validated&lt;/a&gt; (DV) X.509 certificates for free and in an
automated way: users just have to run a piece of software on their
server to get and install a certificate, resulting in a valid TLS
setup.&lt;/p&gt;

&lt;div class=&quot;figure&quot; style=&quot;text-align: center; margin: 1em;&quot;&gt;
    &lt;a href=&quot;https://letsencrypt.org/&quot; title=&quot;Let&#039;s Encrypt&quot;&gt;
        &lt;object type=&quot;image/svg+xml&quot; data=&quot;https://tanguy.ortolo.eu/blog/data/images/logos/letsencrypt.svg&quot;&gt;Let&#039;s Encrypt logo&lt;/object&gt;
    &lt;/a&gt;
&lt;/div&gt;&lt;h2&gt;A threat to other certificate authorities&lt;/h2&gt;

&lt;p&gt;By providing certificates for free and automatically, Let&#039;s Encrypt
is probably a threat to other CAs, at least for part of their activity.
Indeed, for people that are satisfied with DV certificates, there are
not many reasons to pay a commercial CA to get certificates in a
non-automated way. For the &lt;a href=&quot;https://www.cacert.org/&quot; title=&quot;CAcert
website&quot;&gt;CAcert&lt;/a&gt; non-commercial CA, that may mean a slow death, as
this is their main activity&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#note1&quot;&gt;¹&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For people that want &lt;a
    href=&quot;https://en.wikipedia.org/wiki/Public_key_certificate#Validation_levels&quot;
    title=&quot;Wikipedia article about public key certificates, section
    about validation levels&quot;&gt;organization-validated (OV) or extended
    validation (EV)&lt;/a&gt; certificates, Let&#039;s Encrypt is not suitable, so
it will not change anything regarding that.&lt;/p&gt;

&lt;h2&gt;An opportunity for the most reactive&lt;/h2&gt;

&lt;p&gt;The entrance of Let&#039;s Encrypt is also a significant opportunity for
the certificate authorities that will be reactive enough to take
advantage of their innovation. Indeed, they introduced automation in
both domain name validation and certificate issuance (and revocation),
by defining &lt;a href=&quot;https://github.com/ietf-wg-acme/acme/&quot;
    title=&quot;Development of the ACME protocol on GitHub&quot;&gt;an open
    protocol&lt;/a&gt; that is meant to become an Internet standard. That
protocol, named ACME, is not tied to Let&#039;s Encrypt and has &lt;a
    href=&quot;https://github.com/letsencrypt/letsencrypt/wiki/Links&quot;
    title=&quot;List of Let&#039;s Encrypt and ACME implementations&quot;&gt;several free
    software implementations&lt;/a&gt;, so it could be used for the same
purpose by commercial CAs.&lt;/p&gt;

&lt;p&gt;A certification authority could, for instance:&lt;/p&gt;

&lt;ul&gt;
    &lt;li&gt;ask the customer to provision some pre-paid account;&lt;/li&gt;
    &lt;li&gt;manually validate the customer&#039;s identity once;&lt;/li&gt;
    &lt;li&gt;allow the customer to register using ACME, and associate that
        registration to his validated identity;&lt;/li&gt;
    &lt;li&gt;allow the customer to get organization-validated, or perhaps
        even extended validation certificates using ACME, and making
        corresponding debits to his pre-paid account.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Such processes may require or benefit from improvements of the ACME
protocol, which is the very reason Internet standards are defined in an
open way.&lt;/p&gt;

&lt;p&gt;The first certification authority that would implement such a process
could gain an advantage over its competitors, as it would greatly
simplify getting and renewing certificates. I think even Let&#039;s Encrypt
people would be happy to see that happen, as it would serve their goal,
that is basically to help securing the Internet! Personally, I could buy
such a service (assuming it is not restricted to legal persons,
following a quite common (and detestable) sales discrimination against
natural persons&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#note2&quot; id=&quot;noteref2&quot;&gt;²&lt;/a&gt;).&lt;/p&gt;

&lt;h2&gt;Notes&lt;/h2&gt;

&lt;ol&gt;
    &lt;li id=&quot;note1&quot;&gt;CAcert is an unrecognised certificate authority that
        provides identity validation through a web of trust, and
        issues DV server certificates that do not include the validated
        identity. Now that Let&#039;s Encrypt can issue valid DV
        certificates, CAcert is no longer relevant for that activity. It
        also issues personal certificates, that do include the
        validated identity, and that can be used for encryption (e.g.
        S/MIME), signing (e.g. code signing) or authentication, which is
        an activity Let&#039;s Encrypt does not compete with.&lt;/li&gt;
    &lt;li id=&quot;note2&quot;&gt;Yes, the Organization field of a certificate is
        probably not relevant to indicate a physical person&#039;s name, but
        the CommonName field is. Yes, that field is usually abused to
        store the domain name, but a proper use would be to put the
        owner&#039;s name in the CommonName field, and the domain names in
        the subjectAltName field.&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#noteref2&quot;&gt;↑&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;</description>
		<pubDate>Fri, 15 Apr 2016 13:25:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Removing sam2p from Debian</title> 
		<link>https://tanguy.ortolo.eu/blog/article143/removing-sam2p-from-debian</link>
		<guid>https://tanguy.ortolo.eu/blog/article143/removing-sam2p-from-debian</guid>
		<description>&lt;h2&gt;Issues with sam2p and removal&lt;/h2&gt;

&lt;p&gt;I have been maintaining the Debian package of sam2p for some time.
Unfortunately, the upstream development of that program is no longer
active, and it is using an old custom build chain that no longer works
with recent versions of GCC.&lt;/p&gt;

&lt;p&gt;This package is currently failing to build from source, and while I
have been able to patch some issues in the past, and it may still be
possible to fix it again, this is not really sustainable.&lt;/p&gt;

&lt;p&gt;I am therefore considering removing sam2p from Debian, unless
someone has a very good reason to keep it and is able and willing to
maintain it.&lt;/p&gt;&lt;h2&gt;Alternative&lt;/h2&gt;

&lt;p&gt;sam2p is a raster image conversion tool that can convert PNG and JPEG
to EPS and PDF while keeping their compression, which is mostly useful
to use them in documents compiled with LaTeX. Fortunately, the same can
be done with ImageMagick. If you want to convert to EPS, you have to
specify that you want EPS level 2 or 3, otherwise it will produce EPS
level 1, which does not support native raster compression:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
% convert debian-openlogo-raster100.png \
          eps3:debian-openlogo-raster100.eps
% convert debian-openlogo-raster100.png \
          debian-openlogo-raster100.pdf
% ls -lh
1.7K debian-openlogo-raster100.png
6.0K debian-openlogo-raster100.eps
8.8K debian-openlogo-raster100.pdf

% convert photograph.jpg eps3:photograph.eps
% convert photograph.jpg photograph.pdf
% ls -lh
657K photograph.jpg
662K photograph.eps
664K photograph.pdf

% convert scanned-document.png eps3:scanned-document.eps
% convert scanned-document.png scanned-document.pdf
% ls -lh
140K scanned-document.png
145K scanned-document.eps
150K scanned-document.pdf
&lt;/pre&gt;

&lt;p&gt;This is a bit less efficient than sam2p, as convert seems to add some
fixed overhead, but it does keep the appropriate compression algorithm.
See this &lt;a href=&quot;http://www.imagemagick.org/Usage/formats/#pdf_options&quot;
            title=&quot;ImageMagick PDF output options&quot;&gt;documentation
            page from ImageMagick&lt;/a&gt; for more information.&lt;/p&gt;
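
&lt;p&gt;As a reminder, once converted, such an image can be included in a
LaTeX document the usual way; omitting the extension lets the build
chain pick the EPS or the PDF as appropriate:&lt;/p&gt;

&lt;pre class=&quot;code latex&quot;&gt;
\usepackage{graphicx}
% …
\includegraphics[width=.5\linewidth]{debian-openlogo-raster100}
&lt;/pre&gt;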

&lt;h2&gt;Using appropriate formats&lt;/h2&gt;

&lt;p&gt;As a reminder, when writing LaTeX documents, depending on your build
chain, you can use:&lt;/p&gt;

&lt;dl&gt;
    &lt;dt&gt;photographs&lt;/dt&gt;
    &lt;dd&gt;JPEG or EPS (converted from JPEG with ImageMagick);&lt;/dd&gt;
    &lt;dt&gt;raster drawings, screenshots…&lt;/dt&gt;
    &lt;dd&gt;PNG or EPS (converted from PNG with ImageMagick);&lt;/dd&gt;
    &lt;dt&gt;vector graphics&lt;/dt&gt;
    &lt;dd&gt;PDF or EPS (converted from SVG with Inkscape).&lt;/dd&gt;
&lt;/dl&gt;</description>
		<pubDate>Fri, 22 Jan 2016 00:52:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Scale manufacturers…</title> 
		<link>https://tanguy.ortolo.eu/blog/article137/scale-manufacturers</link>
		<guid>https://tanguy.ortolo.eu/blog/article137/scale-manufacturers</guid>
		<description>&lt;p&gt;Dear manufacturers of kitchen scales, could you please stop
considering your clients as idiots, and start developing &lt;em&gt;useful&lt;/em&gt;
features?&lt;/p&gt;&lt;p&gt;&lt;em&gt;Liquid measurement:&lt;/em&gt; this is one feature that is
available on almost every electronic scale available. Except it is
completely useless to people that use the metric system, as all it does
is replace the usual display in &lt;em&gt;grammes&lt;/em&gt; by &lt;em&gt;centilitres&lt;/em&gt;
and divide the number on display by ten. Thank you, but no person that
has been to school in a country that uses the metric system needs
electronic assistance to determine the volume corresponding to a given
weight of water, and for people that have not, a simple note written on
the scale, stating that “for water or milk, divide the weight in grammes
by ten to get the volume in centilitres” should be enough.&lt;/p&gt;

&lt;p&gt;Now, there is still one thing that an electronic scale could be
useful for, which is determining the volume of liquids other than water
(density 1 g/ml) or milk (density approx. equal to 1 g/ml), most
importantly: oil (density approx. equal to .92 g/ml for edible oils like
sunflower, peanut, olive and canola).&lt;/p&gt;</description>
		<pubDate>Mon, 26 Jan 2015 14:54:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Proof of address: use common sense!</title> 
		<link>https://tanguy.ortolo.eu/blog/article135/proof-of-address</link>
		<guid>https://tanguy.ortolo.eu/blog/article135/proof-of-address</guid>
		<description>&lt;p&gt;As I have just moved to a new home, I had to declare my new address
to all my providers, including banks and administrations which require
a proof of address, which can be a phone, DSL or electricity bill.&lt;/p&gt;

&lt;p&gt;Well, this is just stupid, as, by definition, one will only have a
bill after at least a month. Until then, that means the bank will keep a
false address, and that the mail they send may not be delivered to the
customer.&lt;/p&gt;&lt;p&gt;Now, bankers and employees of similar administrations, if you could
use some common sense, I have some information for you: when someone
moves to a new home, unless he is hosted by someone else, he is either
a renter or an owner. Well, you should know that a renter has a contract
that proves it, which is called a lease. And an owner has a paper that
proves it, which is called a title or, before it has been issued by the
administration, a certificate of sale. Now if you do not accept that as
a proof of address, you just suck.&lt;/p&gt;

&lt;p&gt;Besides, such zeal in checking one&#039;s address is pointless, as it is
easy to get a proof of address without waiting for a phone, DSL or
electricity bill (or to prove a false address, actually…) by simply
faking one. And as a reminder, at least in France, forgery is punishable by law
but defined as an alteration of truth &lt;em&gt;which can cause a
    prejudice&lt;/em&gt;, which means modifying a previous electricity bill to
prove your actual address is &lt;em&gt;not&lt;/em&gt; considered a forgery (but
using the same means to prove a false address is, of course!).&lt;/p&gt;</description>
		<pubDate>Thu, 08 Jan 2015 12:54:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
	<item>
		<title>Using bsdtar to change an archive format</title> 
		<link>https://tanguy.ortolo.eu/blog/article134/bsdtar-change-archive-format</link>
		<guid>https://tanguy.ortolo.eu/blog/article134/bsdtar-change-archive-format</guid>
		<description>&lt;h2&gt;Streamable archive formats&lt;/h2&gt;

&lt;a href=&quot;http://tango.freedesktop.org/&quot; title=&quot;From the Tango Desktop project, Public Domain&quot;&gt;&lt;img src=&quot;https://tanguy.ortolo.eu/blog/data/images/icons/tango-archive-128.png&quot; alt=&quot;Package icon&quot; style=&quot;float: right;&quot;/&gt;&lt;/a&gt;

&lt;p&gt;Archive formats such as &lt;em class=&quot;format manpage&quot;&gt;tar(5)&lt;/em&gt; and &lt;em
    class=&quot;format manpage&quot;&gt;cpio(5)&lt;/em&gt; have the advantage of being streamable,
so you can use them for transferring data with pipes and remote shells,
without having to store the archive in the middle of the process, for instance:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
$ cd public_html/blog
$ rgrep -lF &quot;archive&quot; data/articles \
      | pax -w \
      | ssh newserver &quot;mkdir public_html/blog ;
                       cd public_html/blog ;
                       pax -r&quot;
&lt;/pre&gt;&lt;h2&gt;Turning a ZIP archive into a tarball&lt;/h2&gt;

&lt;p&gt;Unfortunately, many people will send you data in non-streamable archive
formats such as ZIP&lt;a href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#note1&quot; id=&quot;notecall1&quot;&gt;¹&lt;/a&gt;. For such cases, &lt;em
    class=&quot;command manpage&quot;&gt;bsdtar(1)&lt;/em&gt; can be useful, as it is able to
convert an archive from one format to another:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
$ bsdtar -cf - @archive.zip \
      | COMMAND
&lt;/pre&gt;

&lt;p&gt;These arguments tell &lt;em class=&quot;command&quot;&gt;bsdtar&lt;/em&gt; to:&lt;/p&gt;

&lt;ul&gt;
    &lt;li&gt;create an archive;&lt;/li&gt;
    &lt;li&gt;write it to stdout (contrary to GNU tar which defaults to stdout,
    bsdtar defaults to a tape device);&lt;/li&gt;
    &lt;li&gt;put into it the files it will find in the archive &lt;em
        class=&quot;filename&quot;&gt;archive.zip&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The result is a tape archive, which is easier to manipulate in a stream than
a ZIP archive.&lt;/p&gt;
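
&lt;p&gt;The same technique can of course produce a file instead of a stream,
for instance to turn a ZIP archive directly into a compressed
tarball:&lt;/p&gt;

&lt;pre class=&quot;cli&quot;&gt;
$ bsdtar -czf archive.tar.gz @archive.zip
&lt;/pre&gt;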

&lt;h2&gt;Notes&lt;/h2&gt;

&lt;ol&gt;
    &lt;li id=&quot;note1&quot;&gt;Some will say that, although ZIP is based on a file index, it
    can be streamed because that index is placed at the end of the archive. In
    fact, that characteristic only allows streaming the archive creation, but
    requires storing the full archive before being able to extract it. &lt;a
        href=&quot;https://tanguy.ortolo.eu/blog/rss/categorie2#notecall1&quot;&gt;↑&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;</description>
		<pubDate>Tue, 09 Dec 2014 16:00:00 +0000</pubDate>
		<dc:creator>Tanguy</dc:creator>
	</item>
</channel>
</rss>