I recently published a small Scrapy extension on GitHub that I have been using 
for a while. The project lets you define Scrapy items using SQLAlchemy and 
provides a convenient way to save those items to a database.

https://github.com/ryancerf/scrapy-sqlitem.  I would appreciate feedback 
and I hope someone finds it useful.


It lets you write Scrapy Items using SQLAlchemy models or tables, just like 
DjangoItem:

from sqlalchemy import Column, Integer, MetaData, String, Table
from sqlalchemy.ext.declarative import declarative_base

from scrapy_sqlitem import SqlItem

Base = declarative_base()

# Define the item from a declarative model...
class MyModel(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True)
    name = Column(String)

class MyItem(SqlItem):
    sqlmodel = MyModel

# ...or define it directly from a Table.
metadata = MetaData()

class MyItem(SqlItem):
    sqlmodel = Table('mytable', metadata,
        Column('id', Integer, primary_key=True),
        Column('name', String, nullable=False))

It also has a SqlSpider that automatically saves scraped items to the 
database (via the item_scraped signal).
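
A minimal sketch of how that might look, assuming SqlSpider is importable from 
scrapy_sqlitem the same way SqlItem is, that MyItem is the item defined above, 
and that the item exposes its table columns as fields (the database connection 
settings the extension needs are not shown here):

from scrapy_sqlitem import SqlSpider

class MySpider(SqlSpider):
    name = 'myspider'
    start_urls = ['http://example.com']

    def parse(self, response):
        # Anything yielded here is picked up via the item_scraped
        # signal and written to the 'mytable' table for you.
        yield MyItem(name=response.xpath('//h1/text()').extract_first())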

There are also some settings to push items to the database in chunks rather 
than item by item:

DEFAULT_CHUNKSIZE = 500
CHUNKSIZE_BY_TABLE = {'mytable': 1000, 'othertable': 250}
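
These go in the project's settings.py like any other Scrapy setting; a brief 
sketch (presumably the per-table values override the default for those tables, 
and everything besides the two setting names above is illustrative):

# settings.py
BOT_NAME = 'myproject'

# Flush items to the database in batches of this size by default...
DEFAULT_CHUNKSIZE = 500

# ...but use these batch sizes for the named tables.
CHUNKSIZE_BY_TABLE = {'mytable': 1000, 'othertable': 250}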
