An assessment of local web page load times

The ability to effectively use technology such as SMART Boards in our classrooms depends directly on the responsiveness of the network serving those technologies. To borrow a Sun Microsystems slogan, "the network is the computer." While some faculty rely heavily on PowerPoint presentations, which, being locally stored, incur little load latency, the real power of present day learning technologies is only unleashed when they are connected to network and Internet resources.

The page load time data further below records the time, in seconds, for web pages on the college server to completely load on my desktop in the south faculty building. The desktop used, although the oldest in the division, is a Windows XP machine with two gigabytes of RAM. This combination yields a responsive, fast desktop; load speeds are not significantly affected by the desktop hardware.

The desktop is connected via a 100 Mbps connection to a 10/100 hub. I am uncertain of the specifics beyond my local hub, but I gather that hub connects at 100 Mbps to a building hub or switch, which then connects via fiber to the administration building at gigabit speeds. None of the pages accessed involved connecting to resources from beyond the local area network and the Palikir server.

Making sense of the duration data: web page size information

All of the pages loaded are in a single folder on the server ( http://www.comfsm.fm/~dleeling/physci/text/ ). The pages are those that comprise my physical science text. Although there are 51 pages, times were obtained for only 39 page load attempts. My primary task was printing the second edition of the text for the book store, and there were instances when I did not capture the load duration data: I was loading and then printing the sections, and in Firefox 3.6 a print job has to complete before a new page can be loaded.

The web pages are a mix of HTML5 and XHTML pages. The XHTML pages utilize MathML and SVG to present equations and diagrams, which greatly reduces the number of image files required in the text. Only a couple of pages, laboratories seven and eleven, make extensive use of images. In both cases the browser is likely to cache the images on a first load, so subsequent loads will be faster.

All style information for the pages is contained in one of two CSS files; these are also cached by Firefox and thus are not necessarily fetched over the wire with each page request.

Page size statistics



Statistic        Size (bytes)
Count            51
Min              1806
Max              35020
Mode             None
Median           8552
Mean             9950.94
StDev            7687.73
Coef Variation   0.77

The mean page size is 9950 bytes, with half of the pages smaller than the median of 8552 bytes. The largest page is 35020 bytes. None of these statistics includes the attached image files. Once the HTML or XHTML loaded, the images generally loaded immediately from the cache, as these are pages I access frequently. In some cases I was making minor edits to the pages over secure copy protocol (WinSCP) and then reloading the page in order to print it.

For reference, these page sizes are small in terms of modern Internet pages. "From 2003 to 2008 the average web page grew from 93.7K to over 312K" (WebSiteOptimization.com). That puts my largest page at less than half of the 2003 average size, and my mean page size at roughly a tenth of the 2003 average. These physical science pages are simply tiny. If there is difficulty loading these small, efficiently coded pages, then there is little hope of loading the average web page of 2009.
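Those ratios are quick to verify. A small Python check, taking the cited 93.7K and 312K figures as 93,700 and 312,000 bytes (an assumption about the units in the quoted statistic):

```python
# Page sizes from the statistics table above (bytes)
largest_page = 35020
mean_page = 9950.94

# Cited averages from WebSiteOptimization.com, read as bytes (assumed)
avg_2003 = 93_700
avg_2008 = 312_000

print(largest_page / avg_2003)  # ~0.37: under half the 2003 average
print(mean_page / avg_2003)     # ~0.11: about a tenth of the 2003 average
print(mean_page / avg_2008)     # ~0.03: a thirtieth of the 2008 average
```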

In other words, these pages should load far faster than random pages pulled from the server to a desktop with an empty cache, and they are also far smaller than the typical Internet page found "in the wild." The duration data reported below thus represents effectively the fastest times one can hope to obtain from the college server.

Duration data

The following table records the load times for 39 HTML and XHTML files starting at 2:00 P.M. on Monday 19 October 2009; these load times thus reflect "business day" data. Times were measured using my digital watch. No specialized equipment was used, but as the data shows, load durations are not in the sub-second range where millisecond precision would be useful.
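The stopwatch method could also be scripted. A minimal sketch, not the instrument actually used, of timing a page fetch with the same 30 second cutoff the browser applies:

```python
import socket
import time
import urllib.error
import urllib.request

def timed_fetch(url, timeout=30):
    """Fetch a URL and return (seconds elapsed, outcome).

    Mirrors the stopwatch method: a request yielding no response
    within `timeout` seconds is recorded as a failure at the
    timeout value, matching the browser's 30 second limit.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            response.read()  # pull the full body, as a complete page load would
        return round(time.monotonic() - start, 1), "loaded"
    except (urllib.error.URLError, TimeoutError, socket.timeout):
        return float(timeout), "failed: time out"

# Example, using the folder described above:
# seconds, outcome = timed_fetch("http://www.comfsm.fm/~dleeling/physci/text/")
```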


Load time (s)   Comments

Time 14:00
5
6
12
7
12
30              failed: time out
28
30
10
12
37              progressive loading
30
12
9
7
35              progressive loading
13
5
13
1               less than one
3
5

Time 15:30
27
30              failed: time out
14
21
5
8

Time 15:35
30              failed: time out
30              failed: time out
30              css failed to load
13
30              failed: time out
22
36              progressive loading
11
10
9
10

Progressive loading means that the page began to arrive before the browser's 30-second time-out, so the request was not terminated even though the complete load took longer than 30 seconds. If the browser receives no data from the server within 30 seconds, it reports failure and terminates the page request.

Duration statistics


Statistic        Value (seconds)
Count            39
Min              1
Max              37
Mode             30
Median           12
Mean             16.87
StDev            11
Coef Variation   0.65
Lower 95% CI     13.31
Upper 95% CI     20.44

The mean load time for these pages was 17 seconds, with half of the pages taking 12 seconds or more to load. With 95% confidence, the mean page load time for similar pages under similar network conditions will fall between 13.31 and 20.44 seconds. At first glance this might not seem that long, yet in the classroom it represents a very long pause.
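As a cross-check, the summary statistics and the confidence interval above can be recomputed from the 39 recorded durations. A minimal sketch using Python's standard statistics module, with the t-multiplier for 38 degrees of freedom (about 2.024) hard-coded rather than taken from a statistics library:

```python
import statistics

# The 39 load durations (seconds) from the table above, in order
durations = [
    5, 6, 12, 7, 12, 30, 28, 30, 10, 12, 37, 30, 12, 9, 7, 35, 13,
    5, 13, 1, 3, 5,                              # 14:00 block
    27, 30, 14, 21, 5, 8,                        # 15:30 block
    30, 30, 30, 13, 30, 22, 36, 11, 10, 9, 10,   # 15:35 block
]

n = len(durations)                       # 39
mean = statistics.mean(durations)        # 16.87
sd = statistics.stdev(durations)         # 11.0 (sample standard deviation)
mode = statistics.mode(durations)        # 30
median = statistics.median(durations)    # 12

# 95% confidence interval for the mean, t* ~ 2.024 for 38 df
t_star = 2.024
margin = t_star * sd / n ** 0.5
lower, upper = mean - margin, mean + margin  # about 13.31 to 20.44
```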

On the weekends, pages such as those above load in under a second, virtually instantaneously. While ideal, this may not be realistic during the business day at the college. I think it was Mary Wilgocki, formerly of REI and subsequently the IT specialist on the college Title III team, who noted that at REI the goal was a response time of under three seconds to a telephone order salesperson's desktop. I may have that number too high; it might have been less, but certainly not more. The reason was business driven: a wait of over three seconds gave the customer on the phone time to reconsider their purchase. REI lost orders when wait times exceeded something on the order of a couple of seconds. Pause for a moment and count one one-thousand, two one-thousand, three one-thousand. Time enough to think, or for a student to become distracted.

For some hard numbers, the WM100 is a performance benchmark based on the average page load times for 100 of the most visited sites on the Internet, as reported by Alexa (WebMetrics.com, accessed on 25 October 2009; data updated hourly). The median load time for the top 100 is 4.4 seconds, with number three Google loading its HTML in an average of 0.45 seconds and the full page in 0.73 seconds. Clearly, local network load times should be well under these Internet values.

In my teaching methods courses I was taught to count off six seconds after asking a question in class. Six seconds is a long pause, but many instructors jump into the perceived uncomfortable silence long before six seconds.

Thus a page load of 12 or more seconds is an intensely long pause in a classroom situation, and a page timing out after 30 seconds almost completely disrupts the flow of a classroom presentation.

For students in courses such as mine, where academic support materials are all available online, the long load times impede their ability to study effectively and efficiently using the material I have placed on the college servers.

Discussion

The page load times for locally stored and served pages are entirely within the ability of the college to remedy; this does not involve connectivity to the Internet. In fact, any expectation of improved off-island bandwidth via the fiber optic cable arriving on Pohnpei this December has to be tempered by the reality that our own core systems are slow to deliver local pages even on a high speed LAN. I cannot imagine what the load times would look like from WAN locations beyond the Palikir campus.

There is a tendency to blame social media for the lack of bandwidth. This leads to solutions such as throttling back social media sites during the business day. While I do not disagree with this approach, it is rather like keeping children from getting infected cuts on their feet by restricting them to padded bedrooms and never letting them play outside. Trying to find new restrictions to stretch the college's all too limited bandwidth is a band-aid approach at best. Given that this is being done already, the data above indicates how insufficient the approach is at present.

The most explosive area of growth in education is in online videos, often via YouTube and other video sharing sites. These sites often double as social media sites, so throttling them back would also limit access to educationally useful material. There is no way our current network could begin to deliver the series of ocean science streaming videos now embedded in version five of Google Earth. The issue of social media access and throttling is dwarfed by the demands streaming video would place on the system.

Yet streaming video, including live feeds such as Skype, is likely to be important to distance education efforts here at the college.

Conclusion

This data is offered in good faith to support IT in funding requests to upgrade the network core at the college. Servers and networks elsewhere deliver pages to sites such as ours in under a second, and thousands of vastly larger enterprises on the planet deliver pages to far more users. This is not rocket science; it is simply an equipment issue, something that money can solve.
