
Wednesday, July 24, 2019

Troubleshooting Wireguard VPN on Windows 10, Android and Linux

I have had my share of pain over the complexity / slowness / incompatibilities / vulnerabilities of using Cisco, so I was eager to switch to WireGuard.
To me it seems the main problems with WireGuard are the following:
  1. There is not yet enough accumulated experience in the community (blog posts, walk-throughs, how-tos, etc.) for setting up arrangements other than the usual site-to-site and cloud VPN jump-host.
  2. Clients offer less than adequate error messages that could help with debugging / troubleshooting.
  3. Clients across platforms are not consistent.
I am writing this for two reasons:
  • to help fellow users in similar situations,
  • and to give feedback to the developers (I will try to figure out where to submit reports and which of the issues are already known).
Things to fix / disambiguate / document in the various WireGuard components:
  1. The Android client does not have the nice log viewer that is part of the Windows client - and that viewer helped me see what is (not) happening.
  2. The log you can export from the Android client is full of UI-related Java messages, unlike the clean log of the Windows client - this really makes it hard to comprehend what is going on.
  3. The Android client just disappears after a while (even with PersistentKeepalive set to 25), so suddenly the VPN protection is gone without any notification. This did not happen with the OpenVPN Android client, so probably I just have to tell Android not to evict / suspend the VPN app somehow.
  4. The error message "bad address" (Android client, creating a configuration from scratch) is misleading or not informative enough: I got it for example for 192.168.1.1/24 (should be /32 or 192.168.1.0/24) - the client could correct it automatically or at least tell you what is wrong.
  5. It is hard to figure out where WireGuard is logging on Linux with systemd - is it logging at all? (A sketch of how to get at least some kernel-side debug output follows this list.)
    - Could not find any trace of the failed connection attempts, so it was really hard to tell whether my DNS, my port forwarding or my WireGuard config was wrong (it was the latter).
    - Could not find any message about a 192.168.1.2/24 peer being inaccessible (overridden) when a 192.168.1.3/24 peer follows it, so I have to use /32 peers even though the server interface talks to both clients on a 192.168.1.1/24 address.
    - The systemd startup log did not have any relevant messages either.
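One thing that can help here (a minimal sketch, assuming the kernel WireGuard module on a kernel built with dynamic debug support): enable the module's dynamic debug messages and watch the kernel log, where handshake attempts and failures then show up.

    # enable verbose WireGuard kernel messages (assumes dynamic debug support)
    sudo sh -c 'echo "module wireguard +p" > /sys/kernel/debug/dynamic_debug/control'
    # follow the kernel log to see handshake attempts and errors
    sudo journalctl -kf        # or: sudo dmesg -w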
What my mistakes and symptoms were:
  1. Accidentally mixed up a private and a public key. WireGuard just silently fails; it does not tell you that there was a connection attempt but the key was wrong. It could have been any network-related inaccessibility as well...
  2. Did not know how to configure the peer addresses as /32 each, so that they don't interfere with each other but both can still talk to the /24 server interface (see the config sketch below).
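For reference, this is roughly the shape of the server-side config that ended up working - a minimal sketch with placeholder keys and an example listen port, not my exact config:

    # /etc/wireguard/wg0.conf on the server (placeholder keys, example port)
    [Interface]
    Address = 192.168.1.1/24
    ListenPort = 51820
    PrivateKey = <server-private-key>

    [Peer]
    # first client: AllowedIPs must be /32 here, a /24 would shadow the next peer
    PublicKey = <client1-public-key>
    AllowedIPs = 192.168.1.2/32

    [Peer]
    # second client
    PublicKey = <client2-public-key>
    AllowedIPs = 192.168.1.3/32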

Monday, October 7, 2013

Big JPEG Compression and Statistics Experiment

A short while ago I had to think about how to make images smaller without losing (any more) information. For PNGs it is easy: I am a long-time user of IrfanView, which includes the PNGOUT plugin to optimize the compression of the image information.
Unfortunately, two-color images (used frequently in document scanning because they use 1 bit per pixel, 1bpp) are usually stored as Group 4 compressed TIFF images, and in the past I have found these to be smaller than the same image saved as a 1bpp PNG from IrfanView using PNGOUT. I have found no tools to poke around with this compression; also, IrfanView tends to save larger TIFF G4 images than what comes out of scanning.

This made me turn my attention to grayscale (8bpp) images, which are often saved as JPEG, even though that is a lossy compression (meaning the decompressed image will not be equivalent to what came out of the scanner, it will just look like it).
It is very easy to make a JPEG smaller by lowering its quality during compression, but that just increases the difference from the original image. We want to keep as much information of the original image as we can, so I was only looking into improving the compression while keeping the encoded information intact.

This made me do a large-scale experiment on all my fifty-two thousand JPEG images: I compressed them with all the different methods I could find and recorded the results. In a series of posts I will review these results, because I have found several interesting angles to them:
  1. How to check the integrity of all your photos in an automated way: easily done by following these instructions! Basically, on Linux install jpeginfo, then run the following in the designated folder (let me know if you need Windows instructions!):

    find -iname "*.jpg" -print0 | xargs -0 jpeginfo -c | grep -e WARNING -e ERROR
    
  2. How much space can you save by re-compressing your photos, _without_ losing a single pixel of information (nor the EXIF data)?
  3. Is it really worth all the extra hassle to create ultra-progressive JPEGs?
  4. What is an arithmetic-coded JPEG, how much space does it save, and which applications can read it?
  5. Which cameras / camera makes have the best and the worst JPEG compression engines?
  6. How have the megapixels evolved along the years?
  7. How do re-compression gains change compared to image size (megapixels)?
  8. How to automate all this, and which problems need to be solved along the way (e.g. how to create progressive, arithmetic-coded JPEGs)? Actual scripts and binaries where needed! (A sketch of the basic re-compression commands follows this list.)
  9. Shall I look at my 30GB+ MJPEG movies as well? :)
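As a preview of the how-to part, the lossless re-compression variants above can be produced with jpegtran from libjpeg / libjpeg-turbo - a sketch of the idea, not necessarily the exact toolchain behind the numbers below, and the arithmetic variant needs a jpegtran build with arithmetic coding enabled:

    # re-do only the Huffman tables (lossless, keeps EXIF and all other markers)
    jpegtran -copy all -optimize -outfile optimized.jpg original.jpg
    # the same, stored as a progressive JPEG
    jpegtran -copy all -optimize -progressive -outfile progressive.jpg original.jpg
    # arithmetic coding instead of Huffman
    jpegtran -copy all -arithmetic -outfile arithmetic.jpg original.jpg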
For starters, let's see which cameras produced my 51719 JPEGs (yes, several images were taken with mobile phones):

Apple 562
BlackBerry 135
Canon 5928
Casio 214
Fuji 17552
HP 112
Kodak 234
Minolta 244
Motorola 2
Nikon 9783
Nokia 1430
Olympus 13813
Panasonic 698
Pentax 5
Samsung 236
Sony 276
#N/A 495

And a sneak peek at the total possible file size / storage gains, to give you a quick answer:
  1. Originals: 135.6GB
  2. Re-compressing only the Huffman coding: 130.21GB, 3.78% total gain
  3. Re-compressing the Huffman coding, and storing a progressive JPEG: 123.67GB, 8.79% total gain, 4.82% gain over just Huffman optimization!
  4. Re-compressing the Huffman coding, and optimizing progressive JPEG storage (a.k.a. ultra-progressive JPEG): 122.53GB, 9.64% total gain, only 0.85% gain over progressive JPEG
  5. Re-compressing using arithmetic compression instead of Huffman: 116.88GB, 13.81% total gain, 4.17% gain over the best possible Huffman compression!
  6. Re-compressing using arithmetic compression and storing a progressive JPEG: 114.72GB, 15.4% total gain, 1.59% gain over non-progressive arithmetic JPEG
  7. Re-compressing using arithmetic compression and optimizing progressive JPEG storage (a.k.a. ultra-progressive arithmetic JPEG): 113.94GB, 15.97% total gain, only 0.58% gain over progressive arithmetic JPEG
Breakdown for each camera brand in the next post!

Questions welcome, as always! :)

Thursday, August 19, 2010

Using xargs when a filename contains special characters or spaces

I liked this so much that I am translating it:
find . -print0 | xargs -0
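
A quick illustration of why the NUL-separated form matters (my own example, not from the original source) - it stays correct even if a filename contains spaces or newlines:

# remove all *.tmp files safely, regardless of spaces or newlines in their names
find . -name "*.tmp" -print0 | xargs -0 rm --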

Source: Not So Frequently Asked Questions
