Hi ivanb,
I'm facing the same problem.

I'm trying to use Tornado as the web service: it serves a page with a 
submit form, passes the URL the user enters to Scrapy, runs my spider, 
and loads the items into MongoDB. At the same time, the web page keeps 
fetching items from MongoDB, so users don't have to wait until the 
spider is done.
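
This is the rough sketch I've been experimenting with, in case it makes 
the setup clearer. The spider name ("myspider"), database ("scraped"), 
and collection ("items") are placeholders; it assumes the spider takes 
a start_url argument and that a pipeline writes the items into MongoDB. 
Launching the spider as a subprocess keeps Scrapy's Twisted reactor out 
of Tornado's IOLoop:

import subprocess

import pymongo
import tornado.ioloop
import tornado.web

# Placeholder database/collection: whatever your MongoDB pipeline
# writes to.
items = pymongo.MongoClient()["scraped"]["items"]


class SubmitHandler(tornado.web.RequestHandler):
    def get(self):
        # Bare-bones submit form for the URL to crawl.
        self.write('<form method="post">'
                   '<input name="url"><input type="submit"></form>')

    def post(self):
        url = self.get_argument("url")
        # Run the spider in a separate process so Scrapy's Twisted
        # reactor never touches Tornado's IOLoop. "myspider" is a
        # placeholder and must accept a start_url argument.
        subprocess.Popen(
            ["scrapy", "crawl", "myspider", "-a", "start_url=" + url])
        self.redirect("/results")


class ResultsHandler(tornado.web.RequestHandler):
    def get(self):
        # Show whatever the pipeline has stored so far; the user can
        # reload this page while the spider is still running.
        for item in items.find().limit(50):
            self.write("%s<br>" % item.get("title", item["_id"]))


application = tornado.web.Application([
    (r"/", SubmitHandler),
    (r"/results", ResultsHandler),
])

if __name__ == "__main__":
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()

The pymongo calls block the IOLoop, which is fine for a sketch like 
this but worth revisiting in a real app.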

I've googled a lot, but I'm new to Python and having a hard time.

I hope you solve this before I do; I'm desperate for a solution.

Thanks. 


On Tuesday, April 3, 2012 at 4:05:39 PM UTC+8, ivanb wrote:
>
>      I made a couple of scrapers, some basic stuff, and I'm now 
> looking into creating a UI for them, so they can be run from that UI 
> by just clicking. I would appreciate some guidelines on where to 
> look, and an example of something similar if one exists. I guess the 
> first and main step would be running Scrapy from a script, and the 
> rest of it wouldn't be a problem. 
>
>       I'm looking for advice on how to run a scraper from a script. 
> Someone has surely done something like this before, so I could see 
> what can be done. Please share some advice. 
>
>       Thanks
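
For the "running Scrapy from a script" part, the minimal version I've 
seen in the Scrapy docs looks roughly like this (again, "myspider" is 
a placeholder, and the exact API can differ between Scrapy versions):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Uses the surrounding Scrapy project's settings; "myspider" must be
# a spider name registered in that project.
process = CrawlerProcess(get_project_settings())
process.crawl("myspider")
process.start()  # blocks until the crawl finishes

A UI button could then simply run that script in a subprocess.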
