- wbadmin get disks - lists the disks in the system and the space used by backups
- wbadmin get versions - lists the backup versions
- wbadmin delete backup -keepVersions:10 - deletes older backups, keeping the last 10
- wbadmin delete backup -version:04/08/2017-10:16 - deletes the specified backup version
Sunday, April 9, 2017
Sunday, November 27, 2016
Here is the result: http://alinconstantin.com/download/browsercookies/cookies.html
- Chrome is silly, not allowing cookies when scripts are run from file:// locations.
- IE & Edge have very low limits for the total cookie size. You probably don't want to send 10 KB of cookies with every HTTP request, but I've seen websites hit these limits... 15 KB would have been more reasonable (and closer to the 16 KB default limit for header sizes on the server side). On the plus side, 5 KB per cookie is better than the ~4 KB of all other browsers (and I've seen websites hit that limit in other browsers, too).
| Browser | Max bytes/cookie | Max cookies | Max total bytes for cookies |
|---|---|---|---|
| IE11 & Edge | 5117 | 50 | 2*5117 = 10234 |
| Chrome 54 | 4096 | 180 | 180*4096 = 737280 |
| Firefox 50 | 4097 | 150 | 150*4097 = 614550 |
| Opera 41 | 4096 | 180 | 180*4096 = 737280 |
Thursday, November 24, 2016
RFC 2965 says a browser should support at least 20 cookies of 4096 bytes each, but browsers usually support higher limits. E.g. Chrome supports 180 cookies of 4096 bytes each, per domain, with no limit on the total size of all cookies. That makes 720 KB of data that Chrome allows in each request.
These limits vary from HTTP server to server, and the server's response when you make larger requests varies, too. Here are some examples:
- www.microsoft.com - throws SocketException / ConnectionForcefullyClosedByRemoteServer after ~16 KB of cookies
- portal.office.com - Starts returning "400 Bad Request – Request Too Long. HTTP Error 400. The size of the request headers is too long" after ~15 KB of cookies
- www.google.com - Starts returning 413 Request Entity Too Large after ~15 KB of cookies
- www.amazon.com - Starts returning 400 Bad Request after ~7.5 KB
- www.yahoo.com - Accepts requests up to ~65 KB, after that returns 400 Bad Request
- www.facebook.com - Accepts about ~80 KB, after that starts returning 400, 502, or throws WebException (seems dependent on the number of cookies, too)
I wrote an app one can use to test and get an idea of a server's limits. You can download it from
http://alinconstantin.com/Download/ServerCookieLimits.zip and invoke it with the http:// URI of the server to test as a parameter. The app makes requests to the server with cookies of various decreasing sizes, trying to narrow down the maximum accepted cookie size. The output looks like the picture below.
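The core of the approach is just a search over request sizes. Here's a minimal Python sketch of that idea (not the actual app's code): the probe is abstracted as a callable so the search logic stands alone; a real probe would send a Cookie header of the given size with urllib and treat 4xx responses or connection errors as rejection.

```python
# Binary-search the largest cookie payload (in bytes) a server accepts.
# `accepts` is a probe callable: accepts(size) -> True if the server
# accepted a request carrying `size` bytes of cookies.

def max_accepted_size(accepts, lo=0, hi=128 * 1024):
    """Find the largest size in [lo, hi] for which accepts(size) is True."""
    if not accepts(lo):
        return None  # even the smallest request is rejected
    while lo < hi:
        mid = (lo + hi + 1) // 2  # bias up so the loop always makes progress
        if accepts(mid):
            lo = mid  # mid works; the limit is at mid or above
        else:
            hi = mid - 1  # mid rejected; the limit is below mid
    return lo

# Simulated server that rejects cookie headers over 15 KB,
# similar to the portal.office.com behavior listed above.
limit = 15 * 1024
print(max_accepted_size(lambda n: n <= limit))  # 15360
```

With a real network probe plugged in, this narrows down the server's limit in about 17 requests instead of trying every size.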
Saturday, September 5, 2015
My WiFi router is a Netgear Nighthawk R8000, which boasts a 3.2Gbps WiFi speed. That's just for PR; in reality, it has one 2.4GHz band with a max of 600Mbps and two 5GHz channels, each supporting a max of 1300Mbps. So the maximum connection speed is limited to the max speed of the band I'm using. But that's not the end of the story - both my Surface Pro 3 and my wife's laptop connect at a maximum of 866.5Mbps, and that's when staying 2-3 feet from the router. The speed is actually negotiated between the router and the client device. If I move 10 feet away, the speed starts dropping to 700Mbps. If I stay in the living room, the speed drops to 80-90Mbps.
The Surface Pro 3 has a 'Marvell AVASTAR Wireless-AC Network Controller' Wi-Fi adapter, and based on http://www.marvell.com/wireless/avastar/88W8897 its maximum WiFi speed is 867Mbps.
I'm reaching this speed (so Microsoft kept its promise and fixed the low speed problem), but I have to be just a few feet from the router to reach it. And even in these ideal conditions I'd need at least 4 Surfaces to saturate the two 5GHz channels, plus more WiFi devices connecting on 2.4GHz, to reach the router's advertised speed... The router speeds are just a PR gimmick.
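A quick back-of-the-envelope check of the marketing math (band speeds from the R8000 specs above, 867Mbps from the Surface's adapter):

```python
# The "3.2 Gbps" marketing number is just the sum of the three radios'
# theoretical maximums; a single client only ever uses one band at a time.
bands_mbps = {"2.4GHz": 600, "5GHz-1": 1300, "5GHz-2": 1300}

print(sum(bands_mbps.values()))  # 3200 -> the advertised "3.2Gbps"

client_max = 867  # Surface Pro 3's Marvell adapter limit
print(min(client_max, max(bands_mbps.values())))  # 867 -> real single-client ceiling
```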
Sunday, November 2, 2014
Today I spent almost one hour trying to figure out why Canon DPP was not able to edit some pictures I took a while ago. All the pictures in one folder were shown like this:
Notice the glyph on top of each image indicating editing was not allowed.
I searched the menus three times for some option to unblock editing, but there was none. I thought the files might be read-only on disk, or have wrong ACLs. Nothing. Some website suggested that, for images edit-protected from the camera, there is an unblock option in the Info window, but that window was completely empty instead of displaying EXIF info.
It was only happening with images in one folder, so I moved an image out of that folder, but nothing changed.
Hours later I viewed the images in Explorer from a different computer, and then I noticed something odd – why was the CR2 size so small as compared with other pictures? Were they corrupted?
And then it hit me – when I took those pictures the battery ran out on my 5D III and I had to use my old camera, a Canon 20D. And DPP was not able to open the files from this older camera…
After a little digging on the net, I had the confirmation: Canon has released Digital Photo Professional 4.0, but only for 64-bit computers and only for certain cameras like the Canon 5D Mark III. Older cameras like the Canon 20D are not supported by DPP 4, and instead I had to download the previous version, DPP 3.14, to edit the raw files. It turns out that even new cameras from Canon like the 7D Mark II are not supported by DPP 4.0, on either 32 or 64-bit Windows. Hopefully Canon will reconsider and add compatibility support for all cameras when they release a new version of DPP 4…
Thursday, September 18, 2014
This article describes in great detail the steps
Sunday, June 1, 2014
I have two Netgear Wi-Fi routers that have Wi-Fi connections enabled with speeds up to 300Mbps. However, the laptop, tablet, etc. connect to them at speeds usually in the 78-144Mbps range, never over 150Mbps. This didn’t bother me much, as these speeds are still over my broadband connection speed (60Mbps), and I don’t transfer many files between the laptop and other computers on the network. But still, why does this happen?
The documentation says “The WNR3500 router will use the channel you selected as the primary channel and expand to the secondary channel (primary channel +4 or -4) to achieve a 40 MHz frame-by-frame bandwidth. The WNR3500 router will detect channel usage and will disable frame-by-frame expansion if the expansion would result in interference with the data transmission of other access points or clients.”
I thought the low speed was caused by router settings. My router had channel 4 set as primary, which left only 4+4=8 as a secondary (4-4=0 is not a valid channel). I thought some interference on channel 8 was preventing it from being used, so I changed the primary to 5, with 1 and 9 now as options for the secondary. But that didn’t increase the connection speed.
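The primary ± 4 rule from the quoted documentation is easy to write down; this small sketch (assuming the US 2.4GHz channel range 1-11) shows why primary channel 4 leaves only channel 8 as a candidate, while primary 5 allows 1 or 9:

```python
# Per the Netgear docs quoted above, the secondary channel for 40 MHz
# operation is primary + 4 or primary - 4, and it must still be a valid
# channel. Range 1-11 is the US 2.4 GHz allocation (an assumption here;
# some regions allow 1-13).
def secondary_options(primary, channels=range(1, 12)):
    return [c for c in (primary - 4, primary + 4) if c in channels]

print(secondary_options(4))  # [8]    (4-4=0 is invalid)
print(secondary_options(5))  # [1, 9]
```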
Today, after digging more, I looked at the adapter’s settings on the laptop. There, the Channel Width for the 2.4GHz range was set to “20 MHz Only”. I set it to Auto, let the laptop reconnect, and voila! The speed increased, reaching values in the 270-300Mbps range, as it should have.
It didn’t make sense: why wouldn’t this be set to Auto by default? Then I remembered. It was.
3 years ago I was experiencing frequent connection drops, and a lot of reconnecting – I was not able to maintain a RemoteDesktop connection to work without the laptop pausing for reconnect every couple of minutes. It was really annoying. And it was me who limited the channel width to 20MHz, which seemed to reduce the number of connection drops.
Well, now I have a second Wi-Fi router to extend the range, and the laptop’s 300Mbps connection seems more reliable now. So I guess I’ll keep the laptop’s channel width at its default setting.
Unfortunately the Surface RT’s network adapter doesn’t have a similar setting, so the tablet will have to connect at 150Mbps max. No loss there until Comcast allows such speeds at reasonable prices.