testing with a mock object
i have a question about mock objects. i currently have a django view function that calls a second function, and that second function calls urllib2.urlopen. i was thinking about adding a mock object so i can get better code coverage of the second function when running manage.py test. my first thought was to create a mock urllib2.urlopen function, but how do i inject it into the code from a django unit test? is there a best-practice way to do this?

    def foo_view(request):
        ...
        foo_func(...)
        ...
        return HttpResponse('response')

    def foo_func(...):
        ...
        response = urllib2.urlopen(url)
        tree = etree.parse(response)
        try:
            if tree.xpath('/exception'):
                raise FooError()
            else:
                result_attribute = tree.xpath('/success/@n-success')[0] == '1'
        except IndexError:
            result_attribute = False
        return result_attribute

    import unittest

    from django.test.client import Client

    class TestFoo(unittest.TestCase):
        def setUp(self):
            self.client = Client()

        def test_foo(self):
            # how do i inject a mock urllib2.urlopen into foo_func?
            ...

thanks,
bryan

--
You received this message because you are subscribed to the Google Groups "Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to django-users+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/django-users?hl=en
Re: touching wsgi file
thanks for your detailed explanation. i had to read it 3 times to try to absorb everything :) i do not need the code to be portable beyond mod_wsgi.

if i'm following what you and steve have said and the pages you both pointed me to, it appears that if i want to share global data i really need to use daemon mode with a single process and multiple threads. to restart django under mod_wsgi in daemon mode, i just need to touch the wsgi file.

actually, i currently have code that uses os.utime, but i didn't know about the SCRIPT_FILE environment variable, so that's a really good tip. i was using a hard-coded path in settings.py. i don't think it worked for me because i don't think i'm in daemon mode. i don't have the script in front of me at this time, so i can't share it with you now.

i'm a little unclear whether you said that touching the wsgi file works only in wsgi daemon multi-process mode, or in wsgi daemon mode regardless of multi-process/multithreaded configuration. having the child processes restart on the next request is not a problem for me, just as long as every mod_wsgi daemon process eventually gets restarted on its next request.

bryan
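for reference, the "touch the wsgi file" trigger described above can be a single os.utime call. the path below is a made-up example (the poster used a hard-coded path from settings.py), and whether anything actually restarts depends on the daemon-mode configuration discussed in this thread:

```python
import os

# hypothetical absolute path to the daemon-mode WSGI script; inside a
# request you could read the real path from the WSGI environ instead of
# hard-coding it
WSGI_SCRIPT = '/srv/mysite/apache/django.wsgi'

def touch_wsgi_script(path=WSGI_SCRIPT):
    # os.utime(path, None) sets the file's access/modification times to
    # "now", the same effect as the shell command `touch`; under mod_wsgi
    # daemon mode the changed mtime triggers a reload on the next request
    os.utime(path, None)
```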
Re: sharing a module level variable
On Nov 2, 2:58 pm, Graham Dumpleton <[EMAIL PROTECTED]> wrote:
> On Nov 3, 2:03 am, belred <[EMAIL PROTECTED]> wrote:
> > [...earlier discussion of the nightly wget update and the module
> > level lookup dicts, quoted in full in the "sharing a module level
> > variable" thread below...]
>
> If the module level data you are talking about are just transient
> global variables in the module, then other processes can't see them
> (or other sub interpreters in the same process). For a start read:
>
> http://code.google.com/p/modwsgi/wiki/ProcessesAndThreading
>
> Written for mod_wsgi, but equally applicable to mod_python for the
> embedded mode description at least. Take note of the very last
> section, where it summarises data sharing issues.
>
> If it is module level data, the only solution would be to use
> mod_wsgi daemon mode and delegate the Django instance to run in a
> single process.
>
> In respect of triggering a restart, if the data is indeed transient,
> i.e., in memory only, how does it get read on startup if you are
> relying on a cron job to trigger its initialisation?
>
> Graham

in the app's __init__.py it sees if a pickle file exists. if so, it reads it in and sets the module level variable. if it doesn't exist, it calls on web services to get new data and processes it, setting the module level variable and saving a pickle. saving the pickle allows the external server to be down as well as my server to restart and still use the "last known good data". it also allows my server to start instantly instead of a 25 second delay while retrieving new data. nightly, cron uses wget to call this routine.
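the load-or-fetch startup described here can be sketched as follows. the cache path and the retrieve_new_data helper are placeholders, not the poster's actual code:

```python
import os
import pickle

def retrieve_new_data():
    # placeholder for the ~25 second call out to the external web services
    return {'foo': '1'}

def load_lookup(cache_path):
    # prefer the "last known good data" pickle: startup is then instant,
    # and it survives both a server restart and the external service
    # being down
    if os.path.exists(cache_path):
        with open(cache_path, 'rb') as f:
            return pickle.load(f)
    data = retrieve_new_data()
    with open(cache_path, 'wb') as f:
        pickle.dump(data, f)  # save for the next startup
    return data
```

the nightly wget-triggered view would call retrieve_new_data() again and rewrite the pickle, giving each freshly started process the new data.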
Re: touching wsgi file
i am using linux and i am using mod_wsgi to run django. i don't understand this sentence: "This should only be done for daemon processes and not within the Apache child processes"... we have many apache child processes. if i follow this code:

    if environ['mod_wsgi.process_group'] != '':
        import signal, os
        os.kill(os.getpid(), signal.SIGINT)

will it restart django in the one child process the code executes in, or in all child processes for the given django project? (i'm trying to find a way to restart the entire django project regardless of how many child processes are running.) also, doesn't the mod_wsgi daemon process exist within an apache child process? if so, that sentence above doesn't make sense to me. there must be something i'm missing or don't understand, because i don't see documentation or people asking questions about properly running django on high volume sites under apache child processes in linux. maybe this is not the recommended way it's done.

On Nov 2, 2:17 pm, Graham Dumpleton <[EMAIL PROTECTED]> wrote:
> On Nov 3, 7:09 am, belred <[EMAIL PROTECTED]> wrote:
>
> > is touching the wsgi file supposed to reload every file in the
> > entire django project, or just reload the one wsgi file?
>
> If you are talking about the WSGI script file when using Apache/
> mod_wsgi, it depends on how you have Apache/mod_wsgi configured. See:
>
> http://code.google.com/p/modwsgi/wiki/ReloadingSourceCode
>
> In other words, for the whole Django instance to be reloaded you must
> be using UNIX and have configured mod_wsgi daemon mode and delegated
> the Django instance to run in the daemon mode process group.
>
> Graham
touching wsgi file
is touching the wsgi file supposed to reload every file in the entire django project, or just reload the one wsgi file?

thanks,
bryan
Re: sharing a module level variable
On Nov 2, 3:29 am, Steve Holden <[EMAIL PROTECTED]> wrote:
> belred wrote:
> > i have a wget event in a cron job. the view does some processing
> > which involves calling external websites and databases (takes about
> > 25 seconds) and updates some module level dictionary lookup
> > variables (about 7 MB of data) which the rest of the program reads.
> > but unfortunately, only the one apache child process seems to be
> > updated, not all the others. how can i share this module level
> > variable across processes? this lookup happens multiple times per
> > request, but the internal data gets updated nightly.
>
> Aah, so this is the real question! I don't believe Django signals are
> intended to serve needs like this. How does "the rest of the program"
> read the module-level dictionary items? The more explicit you are
> about the exact communication mechanism involved, the more likely it
> is someone (probably not me) will be able to help.
>
> regards
> Steve

thanks steve,

your comments about signals were very clear and should be in the documentation.

when i receive the nightly wget request to update the data, it's stored in module level dicts which other parts of the program use by simply importing the data module.

    import data

    def view_func(request):
        foo = data.lookup['foo']
        ...

    def view_update_data(request):
        data.lookup = retrieve_new_data()
        ...

this works correctly for the child process that receives the update, but the other child processes don't seem to be getting the updates. i thought after i did an update i could os.utime() the wsgi file, which would cause all of my django project to restart, but it doesn't seem to be updating my data either.

bryan
sharing a module level variable
i have a wget event in a cron job. the view does some processing which involves calling external websites and databases (it takes about 25 seconds) and updates some module level dictionary lookup variables (about 7 MB of data) which the rest of the program reads. but unfortunately, only the one apache child process seems to be updated, not all the others. how can i share this module level variable across processes? this lookup happens multiple times per request, but the internal data gets updated nightly.

thanks,
bryan
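the behaviour described here can be reproduced outside apache. this self-contained sketch (POSIX only, since it uses fork, the same process model as apache's prefork children) shows that after a fork each process keeps its own copy of module level data, so an update in one child is invisible everywhere else:

```python
import multiprocessing

# "module level" data, analogous to the 7 MB lookup dicts in the post
lookup = {'version': 1}

def _update_and_report(queue):
    # runs in the forked child: it updates only its own copy
    lookup['version'] = 2
    queue.put(lookup['version'])  # report what this child sees

ctx = multiprocessing.get_context('fork')  # POSIX fork, like prefork apache
queue = ctx.Queue()
child = ctx.Process(target=_update_and_report, args=(queue,))
child.start()
seen_in_child = queue.get()
child.join()
# the child saw its updated copy (2), but the parent's lookup is untouched
```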
django signals
i'm having trouble finding documentation about django signals that explains how they behave under apache child processes. if a signal goes off and i have a listener for that signal, will only the listener in the same child process receive it, or will that same listener in all apache child processes receive it? what is the best way to signal all of the apache child processes for your project?

thanks,
bryan
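as steve points out later in this digest, django signals are not meant for cross-process needs: a signal dispatcher is essentially an in-memory list of callables, and send() simply calls them synchronously, so nothing ever crosses a process boundary. the following is a stripped-down sketch of that idea, not django's actual Signal implementation:

```python
# minimal in-process signal dispatcher, illustrating why a signal fired
# in one apache child can never be seen by the other children
class Signal(object):
    def __init__(self):
        self._receivers = []  # plain in-memory list, local to this process

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # synchronous, same-process function calls; returns
        # (receiver, response) pairs like django's Signal.send does
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]

data_updated = Signal()
heard = []

def listener(sender, **kwargs):
    heard.append(kwargs.get('version'))
    return 'ok'

data_updated.connect(listener)
responses = data_updated.send(sender=None, version=2)
```

since connecting and sending both touch only this process's memory, notifying every apache child requires some external mechanism (a shared file, database row, or cache key that each process checks), not signals.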