OK, I have patched those links in. Forgive the code posting last time; I will
answer off-list if I get any further requests.

The script takes a few command line options to specify the size to download,
where to put the logfile, and the interval to wait between downloads.

$> python x_download_test.py 1M -i 5
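
For the hourly check on a 100 Mbps link that started this thread, something
like the following should do (the logfile name is just an example):

$> python x_download_test.py 100M -i 60 -l nbn_download_times.csv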

-Python-----------------

#!/usr/bin/python

import time, optparse, urllib2, csv


if __name__ == "__main__":

    dlurls = {"1M"   : "http://mirror.internode.on.net/pub/test/1meg.test";,
              "10M"  : "http://mirror.internode.on.net/pub/test/10meg.test";,
              "50M"  : "http://mirror.internode.on.net/pub/test/50meg.test";,
              "100M" : "http://mirror.internode.on.net/pub/test/100meg.test
",
              "1G"   : "http://mirror.internode.on.net/pub/test/1000meg.test
",
              "5G"   : "http://mirror.internode.on.net/pub/test/5000meg.test
"
             }

    print("Network Download speed logger. Freeware Licence")

    usage = "usage: %prog [options] arg1 arg2"
    parser = optparse.OptionParser(usage=usage)
    parser.add_option("-i", "--interval", action="store", type="float",
dest="interval", default=10, help="Interval in minutes between downloads")
    parser.add_option("-l", "--logfile", action="store",
dest="logfilename", default="download_times.csv", help="Interval in minutes
between downloads")

    (options, args) = parser.parse_args()

    download_size = "10M"
    if len(args) > 0:
        download_size = args[0]

    download_interval = options.interval * 60
    download_url = dlurls[download_size]

    # setup a logfile
    f = open(options.logfilename, 'a')
    writer = csv.writer(f)

    while True:

        print("Downloading %s" % download_size)

        # Wall-clock time before the download starts
        start = time.time()

        response = urllib2.urlopen(download_url)
        response.read()

        elapsed = time.time() - start

        # Log timestamp, nominal size and elapsed seconds; flush so each
        # row reaches disk even though the loop never exits
        writer.writerow([time.strftime("%c"), download_size, elapsed])
        f.flush()

        print("Pausing for %g minute(s)" % options.interval)

        time.sleep(download_interval) # Time in seconds.
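
Rick's suggestion of also verifying the download with an md5 or sha digest
could be bolted on with hashlib, which is in the standard library. A rough,
untested sketch (the expected digest below is a placeholder you would have to
replace with a known-good value for the test file):

-Python-----------------

import hashlib, urllib2

def download_and_digest(url):
    # Read the whole test file and return its SHA-256 hex digest
    data = urllib2.urlopen(url).read()
    return hashlib.sha256(data).hexdigest()

expected = "PUT-KNOWN-GOOD-DIGEST-HERE"   # placeholder, not a real value
digest = download_and_digest("http://mirror.internode.on.net/pub/test/1meg.test")
if digest != expected:
    print("Warning: downloaded file did not verify")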



On Thu, May 22, 2014 at 1:40 PM, Rick Welykochy <r...@vitendo.ca> wrote:

> David wrote:
>
>> On 22/05/14 08:38, Rick Welykochy wrote:
>>
>>> Edwin Humphries (text) wrote:
>>>
>>>> Can anyone suggest a way of testing the download speed of my NBN fibre
>>>> connection every hour and logging it? I have an ostensibly 100Mbps
>>>> connection, but the speed seems to vary enormously, so an automated process
>>>> would be good.
>>>>
>>>
>>> Download a file of known length, say 1000 MB, from a server
>>> whose speed you can trust every hour. Time and log each download.
>>> Also verify the contents of the downloaded file with an md5 or sha
>>> digest.
>>>
>>> This can be automated with an scp inside a simple (shell) script.
>>>
>>>
>> Westnet used to have a file available for exactly this purpose - I dare
>> say other ISP's do too. Perhaps you could ask your own ISP.
>>
> This looks promising:
>
> http://mirror.internode.on.net/pub/test/
>
> I found this via a web search for "test download file residing on an isp
> australia".
>
>
> cheers
> rickw
>
>
> --
> ------------------------------------
> Rick Welykochy || Vitendo Consulting
>
> If consumers even know there's a DRM, what it is, and how it works, we've
> already failed.
>     -- Peter Lee, Disney Executive
>
>
-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
