Firefox, SPDY, and BSecure Cloud Care

I encountered a strange issue the other day where a client couldn’t upload attachments to Gmail using Firefox. Uploads worked fine in IE8 (Windows XP), and the rest of Gmail seemed to work; only attachments misbehaved. I disabled SPDY in about:config (the network.http.spdy.enabled preference) and uploading suddenly worked flawlessly. In tracking down the issue, I noticed that she had the BSecure Cloud Care content filter installed. On a hunch, I used another laptop to test attachment uploading in Firefox (it worked fine), then installed BSecure and tried again. This time it failed until I disabled SPDY.

As a sidenote, once I added mail.google.com to the whitelist in Cloud Care on the second computer, uploading worked fine with SPDY enabled. It was already in the whitelist on the first computer, so I’m not sure what is occurring there. My guess is that some other element was blocked at some point.

The workaround was to simply disable SPDY on the first computer. Not a great solution, given the benefits that SPDY provides, but the best one for that situation.

I didn’t see any information regarding this issue on the internet, so I am posting this for posterity.

IE Bug with Comments and Object Tag

For future reference, and to hopefully save others hours of frustration.

IE8 and earlier cannot handle HTML comments inside the param section of an object tag. So if you have:

<object>
<param />
<!--<param />-->
<param />
</object>

IE will fail to render the entire object, not just the commented-out param.
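
The workaround is simple: keep comments out of the object tag entirely. If you need to disable a param, delete it outright or move the comment above the object:

<!--<param />-->
<object>
<param />
<param />
</object>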

Beginnings of HTML5 Project Starfighter

I started rewriting Project Starfighter using JavaScript and the HTML5 Canvas.

A lot has been completed, but much remains. I think the basic structure of the ships and weapons is done, as is the gameplay logic, although some tweaking remains.
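
For anyone thinking about picking this up, the overall shape is the standard canvas game loop: update the game state, redraw, repeat. Here is a minimal sketch of the idea (the names below are illustrative, not taken from the actual code; older browsers may need a prefixed requestAnimationFrame or setInterval):

// Grab the canvas and its 2D drawing context.
var canvas = document.getElementById('game');
var ctx = canvas.getContext('2d');

// Hypothetical game state: ships with positions and velocities.
var ships = [{ x: 50, y: 50, dx: 1, dy: 0 }];

function update() {
  // Move every ship by its velocity; the real logic adds weapons, collisions, etc.
  for (var i = 0; i < ships.length; i++) {
    ships[i].x += ships[i].dx;
    ships[i].y += ships[i].dy;
  }
}

function draw() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  for (var i = 0; i < ships.length; i++) {
    ctx.fillRect(ships[i].x, ships[i].y, 10, 10); // placeholder sprite
  }
}

function loop() {
  update();
  draw();
  window.requestAnimationFrame(loop);
}

loop();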

The primary reason I am posting this is that I don’t know whether I will ever finish it, as far less time is available for projects like this. I hope that someone will run across this and decide to finish it. The code is licensed under the AGPL 3.0. Basically, I don’t care what you do with it as long as you release the code. And please comment below and link to it if you do finish this!

Go to GitHub to download the code and continue development!

Simple game

Here is a game I wrote a while ago. I don’t think the concept was mine, but I rewrote it in JavaScript…

The goal is to make the entire grid one color in 22 plays or fewer. To do this, you select a color from the bottom row, at which point all squares contiguous with the top-left square change to that color.
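
The color-change step is a classic flood fill. Here is a minimal sketch of the idea (grid and the function name are illustrative, not the actual code):

// grid is a 2D array of color indices; flood-fill from the top-left square.
function flood(grid, newColor) {
  var oldColor = grid[0][0];
  if (oldColor === newColor) return;
  var stack = [[0, 0]];
  while (stack.length > 0) {
    var pos = stack.pop();
    var r = pos[0], c = pos[1];
    if (r < 0 || c < 0 || r >= grid.length || c >= grid[0].length) continue;
    if (grid[r][c] !== oldColor) continue;
    grid[r][c] = newColor; // recolor, then spread to the four neighbors
    stack.push([r + 1, c], [r - 1, c], [r, c + 1], [r, c - 1]);
  }
}

Each play runs something like this with the chosen color; you win when every square matches the top-left one.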

Click Easy to begin.

Network Neutrality 2

After rereading the previous post, I realized that I was addressing the previous comments on that Slashdot thread. So here is a better statement of my ideas on the network neutrality question.

The issue that sparked the network neutrality debate (in tech circles at least) was ISP throttling of BitTorrent. BitTorrent is a peer-to-peer file sharing protocol which works by breaking a file into parts and letting people download each part from other people who have already downloaded it. One of the big advantages is that while a server slows down when tons of people download a file, BitTorrent actually speeds up. However, it causes a consumer with a home connection to use far more bandwidth than normal.

Most consumer internet connections are designed around a very specific use case: lots of small files with lots of time between bursts (normal browsing) and a few larger files (downloads). With this model, ISPs could sell more connections than they could handle under full load: ten users would be sold 15Mb connections backed by a single 45Mb connection (150Mb sold against 45Mb of actual capacity). This works because not everyone runs sustained downloads simultaneously.

However, internet usage has recently changed so that everyone may well be using a full connection simultaneously. The backend connections are being saturated, and ISPs now have to add capacity to handle it. The single biggest factor in this is streaming video. Between YouTube offering 720p and 1080p videos, Netflix switching to a mostly streaming movie service, and Hulu offering TV shows, average home bandwidth usage has gone up tremendously. The BitTorrent issue that used to affect only techs suddenly became mainstream.

High-level ISPs handle a tremendous amount of traffic and usually have peering agreements which allow them to exchange traffic with each other for free. So Google and Comcast probably don’t pay each other anything for the traffic. Comcast and a company called Level 3 Communications had such an agreement, but Level 3 started using a lot more bandwidth and Comcast started charging fees. Level 3 claims that they have an agreement under which the traffic doesn’t cost anything, and Comcast claims that they have to put in more infrastructure to handle that traffic. It makes sense until you ask why Level 3 suddenly has so much traffic: they recently signed a contract with Netflix to carry its traffic.

BitTorrent, Hulu, YouTube, and Netflix happen to compete with the TV and On Demand services that many ISPs, such as Comcast, also offer. It is obviously in Comcast’s interest to keep people from being able to access services like Netflix quickly and inexpensively. However, Netflix depends on the internet connection which Comcast provides. Suddenly, the “fees” are seen in a different light: if Level 3 is charged more for Netflix traffic, Level 3 has to charge Netflix more, Netflix has to charge consumers more, and Comcast wins.

The fundamental question is one of infrastructure. As a civilization, we use certain things in common which become very important to our daily lives. Roads, the airwaves, train tracks, etc… This infrastructure is beginning to include the internet. Infrastructure usually requires a large capital investment to create, and many other companies depend on open access to thrive. When infrastructure is controlled by the same company that provides products over that infrastructure, competition and innovation will frequently suffer. Given this, how is the openness of the infrastructure maintained?

This is network neutrality in a nutshell: the idea that an ISP should not be able to charge more for, block, or throttle competing traffic. I agree with the idea in essence; if Comcast et al. had blocked all competition and offered their own services instead, we would probably not have Hulu, Netflix, YouTube, Google, Facebook, and many of the other services we use daily.

The question is how to enforce this openness, or whether it even needs enforcing. It must be noted that all of the above-mentioned services appeared under the existing system. It is possible that the streaming video question will work itself out likewise. But if it does not, should we try to regulate openness?

The current network neutrality debate is over the FCC’s ability to regulate this aspect of the internet. Its legal authority is questionable at best, so allowing it to simply assume this power sets a bad precedent. If the FCC should be regulating this, Congress should explicitly authorize it to do so.

The wider debate is whether the government should have any role in regulating ISPs whatsoever. While most people can switch ISPs if their ISP starts blocking or throttling a service they want, it is quite difficult to switch governments, or to get a government to relinquish a power once taken. The government also does not have a great track record of keeping pace with innovation. Allowing the FCC or another agency to regulate the internet may have a chilling effect on entrepreneurs.

There is also the potential, as unlikely as it may seem, that the government could use regulatory control to limit free speech. There are fringe groups on the right and left which would like to use network neutrality to block speech they disagree with. Fortunately, the mainstream of both political parties appears unlikely to sanction such a use of power, but there is historical precedent for unusual power being wielded in times of war or other national crisis.

I personally think that allowing the industry to self-regulate is the lesser of two evils.

Laptop Buying Guide

I wrote this for a friend and then decided to post it here. If you would like me to expand on something or clarify, please comment.

The top laptop brand, in my opinion, is Apple. Apple hardware is more reliable and better designed, as is their software. This comes at a price. It is up to you to decide whether it is worth it. I have an iBook G3 from 2001. I still use it to watch DVDs, and the battery still lasts through an entire film.

However, if we limit the discussion to PCs, it is probably best to start with the hardware in question, as it is this that informs our comparison of brands.

To jump to features: generally, when we are comparing laptops, we are concerned with the hardware. The software is usually the same, or can be made so. There are several factors, the relative importance of which depends on your use case.

The hard drive

Two factors here: space and speed. Space, obviously, is the measure of how much data it will contain. If you are storing a ton of photos, movies, or anything else with a large unit size, space can be a very important factor. Standard today is around 500GB, large is 750GB or even 1TB (1024GB, see below), and small is usually about 256GB.

Sidenote regarding space units. One gigabyte is equivalent to 1024 megabytes, one megabyte is equivalent to 1024 kilobytes, etc… The usage of 1024 rather than 1000 comes down to how a computer stores and computes data: everything is binary, so sizes naturally fall along powers of two (1024 = 2^10). However, hard drive manufacturers express hard drive space in multiples of 1000, the result being that you can buy a 500GB hard drive and have it show up in Windows as a 465.7GB hard drive. Quite silly. Macs solve the problem by also computing space in terms of 1000, for better or worse.
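
If you want to check the math yourself, here is the conversion for the 500GB example above (a quick sketch in JavaScript):

// Manufacturer: 500GB means 500 * 1000^3 bytes.
var advertisedBytes = 500 * Math.pow(1000, 3);
// Windows divides by 1024^3 to get the number it displays.
var windowsGB = advertisedBytes / Math.pow(1024, 3);
console.log(windowsGB.toFixed(1)); // prints 465.7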

The other factor with hard drives is speed, usually expressed in RPM (revolutions per minute). Hard drives have platters that spin and are read by a sensor on an arm called a read head. The faster the platters spin, the faster you can read data, but the more power they use. So a higher RPM rating means better speed and less battery life. Laptops usually come with a 5400 RPM hard drive, but I usually try to find a 7200 RPM hard drive, as it helps performance. High end hard drives can be 10,000 to 15,000 RPM, while low end or low power hard drives can be 4800 RPM. The speed will help with boot time, starting programs, and copying or moving data.

There is a newer storage technology called an SSD (solid state drive). This is essentially a really big, really fast flash drive, like you find in a camera. SSD performance can vary widely, from extremely slow to faster than most hard drives. They are very good for battery life, as there are no moving parts. Speed is usually measured in MB per second, that is, how much data can be read or written in a second. This measurement is also useful for comparing SSDs with traditional hard drives, although random vs. sequential data access becomes an issue (see below). The biggest problem with SSDs is that the cost per gigabyte is very high: I can get a 60GB SSD for the cost of a 500GB hard drive.

Sidenote regarding random vs. sequential data access. When a file is read from a hard drive, the read speed depends heavily on how the data is spread out over the platter. If all of the data is in a row, the platter only has to pass under the read head once, and all of it has been read. If the data is spread out in a bunch of different places on the platter, it is slower to read, as the read head has to move all over the place. This is a random read. SSDs don’t have this problem, as a random read is just as fast as a sequential read. Thus, one program may make a hard drive seem faster than an SSD, while the same hardware can yield the opposite result with a different program or workload. So much for hard drives.

Memory

Again, there are two factors, size and speed. However, this is simpler, as you will only notice the difference between memory speeds in specialized workloads. So we only have to worry about memory size. Memory size determines the number of programs you can reasonably run at the same time, the amount of data you can handle in any given program, and how fast your hard drive appears.

Memory is known as volatile storage. It is much faster than a hard drive or SSD; however, when you unplug your computer, everything in it is lost, because it requires power to store the data. Files and programs that are actively used are pulled from the hard drive into memory and then written back to the hard drive when you are done. Memory can also be used to speed up hard drive reads by caching frequently used files, and hard drive writes by holding the file in memory and writing to the hard drive in the background while the program thinks it has already been written. (This is, incidentally, why you can lose data if you unplug your computer without shutting down, or pull out a USB key without safely removing it.)

This goes the other way as well: if a computer is running out of memory, it will start grabbing sections that have not been accessed in a while and write them to “virtual memory” on the hard drive to free up that memory for something else. This is called swapping, and it slows down your computer astronomically.

A standard size for memory is about 4GB, although 6GB and 8GB computers are available now. You can get 1GB, 2GB, and 3GB computers, but these cannot do as much. Memory is probably the easiest part of a laptop to upgrade, and replacements are readily available.

CPU

This is really complicated and changes frequently. There are several primary factors and a host of smaller ones. The big factors are: number of cores, clock speed (gigahertz), cache, word size (e.g. 32 or 64 bits), and power usage.

I would try to have at least two cores. This increases system stability, because one rogue process cannot take down the whole system, and it increases speed when multitasking. If you can get three or four cores, better still.

Clock speed is less of an issue than it used to be. It makes a difference, but most people won’t notice it compared to other factors. Try not to go below 2.5GHz.

Always get a 64-bit processor. 32-bit is really old and is being phased out.

Power usage requires some research and tends to be inversely proportional to speed, although some processors do better here than others.

If you are looking at brands, there are two main ones: Intel and AMD. I am going to detail Intel, as I am no longer very familiar with AMD’s models. In Intel-land, I suggest going for a Core i3, i5, or i7. Pentiums are decent, but outmoded. Celerons shouldn’t be touched; they are ancient and slow. The Atom processor is slow but uses very little power; you will usually see it in netbooks.

Screen

The two factors are size and resolution. Laptop screens are normally 15″ (measured diagonally), 17″, or 14″. You can also get smaller screens, but they tend to be more special purpose. This is a very personal decision: you have to weigh ease of reading, size, and weight to decide what you want. (Sidenote: most MacBook Pros are sold as 13″ models. For some strange reason, they are closer in usefulness to a 15″ PC screen. They also come in 15″ and 17″ varieties.)

Laptop resolution determines how many pixels are on the screen, which means how much you can see at the same time and how big the text is. Again, a personal decision.

Case

This is one of the biggest, yet most overlooked, factors. The weight and solidity of the case is a huge determinant of how long you will use the laptop, and how much you will like it. Apple is king here, but there are some other good systems.

Peripherals

What ports are available? Where are the ports located? It is helpful to have at least one USB port on each side, and hopefully more. Other ports to keep in mind are: camera card readers, HDMI, VGA, eSATA, and audio. This list is not exhaustive.

Battery

The number of cells determines how long the battery will last on a charge. This is usually pretty well defined. Another factor, far harder to determine, is the number of recharge cycles, or how long the battery’s overall lifetime will be. Apple tends to be very good here as well.

Conclusion

Once all of these have been evaluated, I really like (in descending order): Apple, Lenovo, Acer, and HP. There are other brands of varying quality, and individual laptops within a brand will also vary. Hopefully, this will help you make a good decision.

Content Filters

Every so often, I get asked for a recommendation for a content filter — usually accompanied by the adjectives perfect or best. My answer is that the best content filter available is called Parental Supervision. It is free and easy to install, but difficult to maintain…

As for actual software, I haven’t dealt with any that I haven’t been able to bypass. Granted, on occasion it requires a Knoppix boot disk, but still: don’t expect to find a piece of software that you can install on your child’s laptop and be certain that he or she will be unable to access anything bad. Nothing surpasses parental training and supervision. Keeping your computer in a public place is one of the first and most effective steps you can take.

With that said, you probably want some software to keep your child from accidentally stumbling upon something bad in the course of normal browsing. There are two main types of software for this:

The first is a standard blacklist. This is the most common and simplest method of implementing a content filter: the software simply contains a list of sites and pages that contain bad content. Obviously, the weakness of this approach is that it is impossible for anyone to keep up with all of the filth that is added hourly.

The second method is to actually analyze the content as it enters the network or computer. This is much harder to do well, but it blocks far more content, with the downside of more false positives (i.e. good sites that are blocked). This is an acceptable tradeoff for most parents.
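
To illustrate just how simple the first approach is, here is a toy sketch of a site-level blacklist check in JavaScript (the list and names are made up; real filters use huge, categorized lists):

// Block a host if it, or any parent domain, is on the list.
var blacklist = { 'bad.example.com': true, 'worse.example.net': true };

function isBlocked(hostname) {
  var parts = hostname.split('.');
  for (var i = 0; i < parts.length - 1; i++) {
    if (blacklist[parts.slice(i).join('.')]) return true;
  }
  return false;
}

console.log(isBlocked('sub.bad.example.com')); // true
console.log(isBlocked('good.example.org'));    // false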

A simple solution that I recommend everyone use, in addition to any other filter, is OpenDNS. OpenDNS is a site-level filter (amongst other things) that is relatively easy to implement and maintain, and the standard package is free. You can implement it at the network level on your router, or at the computer level by changing some settings; the instructions on the website are quite good. OpenDNS uses a site-level blacklist, so it is not infallible, but it is pretty good. It is also relatively easy to bypass, either by changing the DNS settings on the local computer to point to another DNS provider (although this can be blocked if you have a more advanced router), or by simply connecting to another network (if you have implemented OpenDNS in your router).

The solution I personally use requires a dedicated Linux server and a knowledge of the Linux command line, placing it outside the realm of usefulness for most parents. But in case you are still interested: I use DansGuardian. The advantage of this solution, besides being free, is that it analyzes the content before it reaches the browser, on a per-page basis. It is very powerful, and would also be useful for a medium to large school or organization.

There are a ton of filter packages, about which I know little. Please leave any observations or experiences in the comments!