I didn't use autocommit, but it seems to work fine without it on the development 
server.
There are some rollbacks in the scrapyd code too, so I don't know whether 
autocommit is a good idea.
Why is autocommit important?

Thanks in advance.
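(For anyone reading along: the autocommit behaviour being asked about can be illustrated with Python's stdlib sqlite3 module, which is what scrapyd's default queue uses. This is only a sketch of the concept, not scrapyd's actual queue code; with psycopg2 the equivalent switch is `conn.autocommit = True`.)

```python
import os
import sqlite3
import tempfile

# Sketch of why autocommit matters for a shared job queue:
# two connections to the same database file stand in for two
# processes polling the queue.
db_path = os.path.join(tempfile.mkdtemp(), "queue.db")

# isolation_level=None puts sqlite3 into autocommit mode;
# check_same_thread=False is the flag mentioned in the thread.
writer = sqlite3.connect(db_path, isolation_level=None,
                         check_same_thread=False)
reader = sqlite3.connect(db_path)

writer.execute("CREATE TABLE queue (message TEXT)")
writer.execute("INSERT INTO queue VALUES ('job-1')")  # committed at once

# Without autocommit, that INSERT would sit in an open transaction
# until commit(): invisible to the reader and holding a write lock.
rows = reader.execute("SELECT message FROM queue").fetchall()
print(rows)  # [('job-1',)]
```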

On Tuesday, December 27, 2016 at 11:00:50 PM UTC+2, Nikolaos-Digenis 
Karagiannis wrote:
>
> 1. Yes, just keep the connection in autocommit mode.
> Also, know that you will have to ensure your PostgreSQL server's availability.
> 2. Leftovers. They shouldn't matter.
>
> On 27 December 2016 at 17:59, k bez <[email protected]> wrote:
>
>> Ok, I patched the *sqlite.py* module and the SqlitePriorityQueue class, and 
>> it seems to work fine with PostgreSQL.
>> But I have some questions, if someone can answer:
>> 1. SQLite uses the parameter "check_same_thread=False" for its connection, 
>> but I think a PostgreSQL connection is thread-safe, so I didn't use anything 
>> like that.
>> 2. What is the use of the remove and clear methods of the SqlitePriorityQueue 
>> class? The queue worked fine before I patched them, as I couldn't find any 
>> call to them. 
>>
>> On Monday, December 26, 2016 at 10:45:25 AM UTC+2, Nikolaos-Digenis 
>> Karagiannis wrote:
>>>
>>> Hi,
>>>
>>> You probably looked into some old documentation.
>>> That setting predates the separation of scrapyd from scrapy.
>>> Job queues are now implemented in scrapyd.
>>> A pickaxe search confirms this:
>>>
>>> https://github.com/scrapy/scrapy/commit/75e2c3eb338ea03e487907fa8c99bb12317e9435
>>> This was a point where many release notes are missing,
>>> notes clarifying what was removed from scrapy,
>>> because the "separation of scrapyd" alone doesn't say much.
>>>
>>> Do you use scrapyd?
>>> Unfortunately, the job queue class is no longer configurable.
>>> It shouldn't be hard, however, to patch scrapyd,
>>> either to make the queue class configurable or to maintain your own fork.
>>>
>>> Check out scrapyd's repository https://github.com/scrapy/scrapyd/
>>> and if you come up with something
>>> don't hesitate to open a PR or an issue with suggestions.
>>> We'll be glad to help,
>>> scrapyd does need its components to become less tightly coupled
>>> and making the job queue configurable can contribute to this.
>>>
>>>
>>>
>>> On Monday, 26 December 2016 01:49:50 UTC+2, k bez wrote:
>>>>
>>>> I have started to implement a custom job queue so I can move from the 
>>>> default SQLite to PostgreSQL.
>>>> I use the setting SPIDER_QUEUE_CLASS = 
>>>> 'mysite.scraper.PostgreSQLQueue', but scrapyd seems to ignore it and 
>>>> creates/uses the default SQLite DB.
>>>> I want to ask: does SPIDER_QUEUE_CLASS no longer work?
>>>> Thanks in advance.
>>>>
>>> -- 
>> You received this message because you are subscribed to a topic in the 
>> Google Groups "scrapy-users" group.
>> To unsubscribe from this topic, visit 
>> https://groups.google.com/d/topic/scrapy-users/V8vshXijC5c/unsubscribe.
>> To unsubscribe from this group and all its topics, send an email to 
>> [email protected].
>> To post to this group, send email to [email protected].
>> Visit this group at https://groups.google.com/group/scrapy-users.
>> For more options, visit https://groups.google.com/d/optout.
>>
>
>
